
ASSESSING THE ENVIRONMENTAL LITERACY OF INTRO ENVIRONMENTAL SCIENCE STUDENTS

By 

Randi Corrine Hogden 

B. S., Metropolitan State College of Denver, 2010 



A thesis submitted to the 

University of Colorado Denver 

in partial fulfillment 

of the requirements for the degree of 

Master of Science

Environmental Science 

2012 



This thesis for the Master of Science

Degree by 

Randi Corrine Hogden 

has been approved by 



Bryan Shao-Chang Wee 



Robert Talbot 



Casey Allen 



April 9, 2012 
Date 



Hogden, Randi Corrine (M.S., Environmental Science) 

Assessing the Environmental Literacy of Intro Environmental Science Students 

Thesis directed by Bryan Shao-Chang Wee 

ABSTRACT 
Using an assessment tool tailored to the Colorado academic science standards, a study was
conducted to evaluate the environmental literacy of postsecondary, non-science majors. Data were
collected from 144 students taking an introductory environmental science class. A 16-item,
multiple-choice environmental knowledge assessment instrument covered environmental
content across three sub-domains in the Colorado academic science standards: Physical Science,
Life Science and Earth Systems Science. The population's total mean score was compared to
sub-domain scores to assess students' overall environmental literacy as well as to identify the
population's weaknesses across the sub-domains. Results showed that the total mean score for
the class was 52.18%, which indicates that the population as a whole does not have a strong
foundation in environmental science or high levels of environmental literacy and needs further
assistance in one or more of the three sub-domains. Statistical analysis revealed that on average
the students scored 67.8% in Physical Science, 53.4% in Life Science, and 37.8% in Earth
Systems Science. Given that the findings were limited to environmental knowledge within the
Colorado science standards, an assessment of environmental knowledge in the social science
standards, including measures of behavior, attitudes and dispositions toward the environment,
is warranted.

Keywords: assessment; environmental education; environmental literacy; environmental science; 
environmental knowledge; Colorado State Science Standards 



ACKNOWLEDGMENT 
Without the support and guidance of Dr. Bryan Wee, this research project would have never 
materialized. You have shaped my mind, my awareness, my spirit and my path. Thank you for 
choosing and believing in me. Thank you to James, my husband, for the comfort and hope you've
given me, the laughter we've shared daily, and your willingness to endure unceasing hours of silence
whilst I studied, wrote and researched. To Yvette and Adam, I love you both and am indebted to
you for countless home-cooked meals and dish washings.



TABLE OF CONTENTS

Figures
Tables

Chapter
1. Prologue
2. Introduction to Literature Review
2.1 Brief history: Environmental Education; Environmental Literacy
2.2 Definitions of Literacy
2.3 Definitions of Science Literacy
2.4 Definitions of Environmental Literacy
2.5 Current demand for EE and EL
2.6 Measuring EL with State Standards
3. Methods
3.1 Introduction to assessment
3.2 Creation of AELIESS
3.3 Identification of measure: General Information; Purpose(s) of measure; Specific sub-domains assessed; Intended test population (Age, Special groups); Administration; Time required; Stimulus items; Administration Procedures; Scoring Procedures; Interpretation procedures
3.4 Support for measure: Item selection; Validity evidence; Reliability
4. Results and Discussion
5. Implications and Conclusion
5.1 Challenges for Education
5.2 Limitations of Assessment
5.3 Dispositions towards the environment
5.4 Environmental values and beliefs
6. Epilogue

Appendix
A. Geographic Dispersion of survey respondents
B. Introduction to Environmental Science Syllabus
C. AELIESS assessment instrument and Answers
D. AELIESS questions chosen using Colorado academic standard outline
E. Studies assessing aspects of EL
F. EL contexts and distributions
G. IRB approval letter

Bibliography






FIGURES

Figure
2.1 Geographic dispersion of survey respondents
3.2 ENVS 1042: Introduction to Environmental Science Syllabus
3.3 AELIESS assessment instrument
4.1 Difficulty and Discrimination Distributions
4.2 Mean Scores for Age Groups
4.3 Sub-domain scores compared to total mean score
5.1 PISA Framework for Assessing Environmental Literacy






TABLES

Table
2.1 AELIESS Questions chosen using the Colorado Academic Standard's outline of critical concepts and skills for K-12
3.1 A selection of studies that assess instructional effectiveness concerning aspects of EL
3.2 Contexts for environmental literacy
3.3 Distributions of contexts
4.1 Total Variance Explained
4.2 Principal Component Analysis
4.3 Cronbach's Alpha Case processing summary
4.4 Cronbach's Alpha Reliability
4.5 Cronbach's Alpha Item-Total Statistics
4.6 Demographic information including percentages of represented ethnicities
4.7 Independent t-test between men and women's scores
4.8 Group statistics for men and women
4.9 Independent t-test between high school graduates and non-graduates
4.10 Group statistics for high school graduates and non-graduates
4.11 Independent t-test for K-12 Colorado and non-Colorado attendees
4.12 Group statistics for K-12 Colorado and non-Colorado attendees
4.13 Descriptives on a One-way ANOVA for Age and Average scores
4.14 One-way ANOVA for Age and Average scores
4.15 Independent t-test between individuals 18 to 20 years old and those 21 to 39 years old
4.16 Group statistics for ages 18 to 20 and 21 to 39






1 . Prologue 

Currently, no research is being conducted on state content standards and how they
relate to environmental literacy. Although we have created exceptional environmental frameworks
and tools for measuring environmental literacy, the assessments are disconnected from the 
academic standards. It is not rational to expect any educator to stray from the academic standards 
they have been given by the state to follow a separate environmental literacy plan. Unfortunately, 
the all too common attitude is that, if it will not be tested, it will not be taught. If we want to 
measure environmental literacy of students, we must draw from what they are actually being 
taught. Environmental knowledge, of natural and human systems, has been incorporated into the 
Colorado Science Standards. Why not use these same standards as a baseline for the 
environmental assessment? It only makes sense. 

The proposed research examines Intro to Environmental Science students and their 
understanding of environmental science knowledge and concepts. The research seeks to answer 
the question: Do post-secondary students possess the environmental knowledge they were taught 
in Kindergarten through twelfth grade (K-12)? Having a clear understanding of the foundational 
concepts, such as the interaction of natural and human systems, is an important aspect of 
environmental literacy. Once the more quantitative foundational concepts are understood, this 
enables the educator to instruct from a more qualitative angle. This approach is known as the
T-educational approach (Golley, 1998). The arms are broad and the stem deep. The ultimate goal of
Intro to Environmental Science is to grow individuals with operational environmental literacy. The 
measured, foundational knowledge highlights normal and memorable patterns of environmental 
relationships and organization of observations, interpretations and generalizations. The research 
includes the use of an assessment tool, AELIESS, created using the new Colorado Department of 
Education K-12 Academic Standards. The research supplies environmental educators with a 
practical assessment tool. 



2. Introduction to Literature Review 

2.1 Brief history: 

Environmental Education (EE) 

It is acknowledged that the primary antecedents of Environmental Education (EE) were 

Nature Study, Outdoor Education, and Conservation Education (Disinger, 1985). The term 

Environmental Education has been so vaguely defined over the years that it has been used 

synonymously with many different constructs: environmental-ecological education, ecological 

education, conservation education, camping education, outdoor education and environmental 

science education (Disinger, 1985). One of the most renowned experts on EE, Harold 

Hungerford, has concluded that EE is not synonymous with the previous fields, but that it has been 

defined and given substantive structure and boundaries (Hungerford, 1975). The definition that 

Hungerford (2005) uses, because of its ease and clarity, is from the Federal Register and states 

that: 

Environmental education is a process that leads to responsible individual and 
group actions... Environmental education should enhance critical thinking, 
problem solving, and effective decision-making skills. Environmental education 
should engage and motivate individuals as well as enable them to weigh various 
sides of an environmental issue to make informed and responsible decisions 
(US EPA, 1992, p. 47516). 

EE became a common phrase and topic of interest in the 1960s and '70s. This topic of 

interest quickly turned into efforts to compose a conceptual framework for EE, built on shaping 

attitudes, motivations and skills (Hart, 1981; Harvey, 1977a; Hungerford, Peyton, & Wilke, 1980; 

Stapp et al., 1969; UNESCO, 1977). In 1977 the world's first Intergovernmental Conference on 

Environmental Education, organized by UNESCO in cooperation with the United Nations 

Environment Programme (UNEP) was convened in Tbilisi, Georgia (USSR). At the close of the 

conference, the Tbilisi Declaration was adopted by acclamation. Within the document, among the 

goals and guiding principles of EE, were the five categories of objectives. The Tbilisi EE 



categories, which provided a solid EE framework for almost two decades, included Awareness, 
Knowledge, Attitudes, Skills, and Participation (UNESCO, 1978). 

Awareness: to help social groups and individuals acquire an awareness and sensitivity to 

the total environment and its allied problems. 

Knowledge: to help social groups and individuals gain a variety of experience in, and 

acquire a basic understanding of, the environment and its associated problems. 

Attitudes: to help social groups and individuals acquire a set of values and feelings of 

concern for the environment and the motivation for actively participating in 

environmental improvement and protection. 

Skills: to help social groups and individuals acquire the skills for identifying and solving 

environmental problems. 

Participation: to provide social groups and individuals with an opportunity to be actively 

involved at all levels in working toward resolution of environmental problems 

(Hungerford, Bluhm, Volk & Ramsey, 2005 p. 15). 

In his Ph.D. dissertation entitled Environmental Education: A Delineation of Substantive 

Structure, Gary Harvey (1977) constructed the generally accepted definition of EE, which has 

endured decades of rigorous disassembling and evaluation. This is the definition most experts in 

the field refer to (Disinger, 1985). Hungerford also refers to and accepts this mediating definition 

as an alternate to the Federal Register's (Hungerford, Peyton & Wilke, 1983). After a thorough 

review of the literature, Harvey defined EE as: 

An interdisciplinary, integrated process concerned with resolution of 
values conflicts related to the man-environment relationship, through 
development of a citizenry with awareness and understanding of the 
environment, both natural and man-altered. Further, this citizenry will be able 
and willing to apply enquiry skills, and implement decision-making, problem- 
solving, and action strategies toward achieving/maintaining homeostasis 
between quality of life and quality of environment (Harvey, 1977b, p. 158). 



For the purpose of this research, Harvey's definition brings in an important concept of 
interdisciplinary processes, which is lacking in the U.S. EPA definition. This concept is 
foundational to the research assessment tool and is covered under Implications and Conclusion, 
section 5. 

Environmental Literacy (EL) 

The concept of Environmental Literacy (EL) has been evolving since it was developed, to 
advance the field of EE, in 1969 (Roth, 1992). The term gained great attention when President 
Richard Nixon began using it in his speeches for the National Environmental Education Act. In 
1992 interpretive scientist Charles E. Roth, who first introduced EL to the world, presented the 
three major levels of EL: nominal EL, functional EL, and operational EL (Roth, 1992). Roth gave 
environmental literacy a purpose in society. For the first time, EL was seen as a continuum based 
on knowledge, values, beliefs and actions. Hungerford and Tomera (1977) considered an 
environmentally literate citizenry as both competent and willing to take action on critical issues. 
Roth (1992) also emphasized the need for knowledgeable citizens, who took action, who worked 
to solve human/environment issues such as population growth, nonrenewable resources, 
consumption, pollution and social injustice. EL became a common term used in schools and 
academic boards across the nation when the American Society for Testing and Materials (ASTM) 
developed consensus standards on EE with a clear definition for EL. 

EE and EL took another great leap when Dr. Deborah Simmons developed a new 
framework for environmental literacy. This framework was based on seven common clusters of 
elements: 

(1) Affect- environmental sensitivity, attitudes, values, motivation and moral reasoning 

(2) Ecological Knowledge 



(3) Socio-Political Knowledge- the relationship of cultural, political, economic, religious and other 
social factors influencing perceptions and activities 

(4) Knowledge of Environmental Issues 

(5) Skills- environmental problems/issues and action/service (analyze, investigate, evaluate) 

(6) Determinants of Environmentally Responsible Behavior- locus of control/efficacy, and 
assumption of personal responsibility 

(7) Behavior- various forms of active participation in solving problems and resolving issues 
(Simmons, 1995). 

Since 1995, environmental literacy assessment instruments have been published (Wilke, 
1995), as well as several national studies using assessments of environmental literacy (e.g., 
Erdogan, 2009; McBeth, 2010; Negev et al., 2008; Shin et al., 2005); however, many of these 
studies have been conducted on middle school students. Simmons' (1995) framework is still 
influential today and has been used in subsequent research by Volk and McBeth (1998), as well 
as by the National Guidelines for Excellence Project to develop guidelines for state standards. On 
December 1, 2011, NAAEE released Developing a Framework for Assessing Environmental 
Literacy at the National Press Club in Washington, DC, which, although it still needs some work, is 
the most promising national framework the country has seen in decades. In 1997 the Organization 
for Economic Co-operation and Development (OECD) started the Programme for International 
Student Assessment (PISA) (Hollweg et al., 2011). Over 70 countries have participated in the 
PISA surveys, which test reading, mathematical and scientific literacy in terms of general 
competencies. The age group of tested students is between 15 years 3 months and 16 years 2 
months, an age right before many students in European countries end compulsory education. On 
August 28, 2011, PISA proposed a framework for assessing EL in 2015. This will be the largest 
international research project ever conducted in EL. 



2.2 Definitions of Literacy 

Individuals are either illiterate or literate, the difference separated by a threshold of 
reading and writing skills. Literacy has been further subdivided into four categories: conventional 
literacy, functional literacy, cultural literacy, and critical literacy (Tozer, Violas & Senese, 2006). 
Conventional literacy has been described as the absolute basics, the ability to read and write. 
There is no connection, however, to greater comprehension. An example of this would be a child's 
ability to recognize or write his or her own name, but decoding a single word is not necessarily the 
same as reading comprehension. This is considered the lowest level of literacy. The highest level 
of literacy is critical literacy, founded on critical thought. This type of literacy is the ability to use a 
greater source of experiences and knowledge to compare and critique writings. This requires, not 
only knowledge of one's culture, but knowledge of many cultures' values, beliefs, views and 
opinions. The ability to give greater meaning to what is read holds great power in societies. Power 
implies control, and those who are illiterate are vulnerable to economic and political 
oppression. With critical literacy, readers are empowered and are able to escape these racial, 
ethnic, gender or social discriminations (Tozer et al., 2006). 

Literacy, therefore, plays a key role in the balance of power, which is why it is so highly 
valued in the United States, a nation built on democracy. Without content knowledge first, 
how can individuals participate in critical thought, reading and writing on topics such as global 
climate change, ecosystem destruction or air quality? These are important issues we, as a society, 
are facing today. More attention has slowly been drawn to this topic, which influenced the birth 
and establishment of science literacy and environmental literacy. 

2.3 Definitions of Science Literacy 

In western culture there has been great emphasis placed on the importance of scientific 
literacy. Science and its technology have given us national security, medicine, clean water and air, 






the ability to explore the universe and so much more. It is no wonder that we aspire to raise up a 

generation of scientifically literate individuals who understand the scientific method, can think 
critically about evidence-based research and who feel prepared, knowledgeable and confident 

when facing scientific dilemmas. 

Scientific literacy means that a person can ask, find, or determine answers to 
questions derived from curiosity about everyday experiences. It means that a 
person has the ability to describe, explain, and predict natural phenomena. 
Scientific literacy entails being able to read with understanding articles about 
science in the popular press and to engage in social conversation about the 
validity of the conclusions. Scientific literacy implies that a person can identify 
scientific issues underlying national and local decisions and express positions 
that are scientifically and technologically informed. A literate citizen should be 
able to evaluate the quality of scientific information on the basis of its source 
and the methods used to generate it. Scientific literacy also implies the capacity 
to pose and evaluate arguments based on evidence and to apply conclusions 
from such arguments appropriately. (National Research Council, 1996, p. 22) 

The above definition can stand the test of time; however, it is sometimes more 
meaningful to use examples that individuals can put into present-day context. For this reason, 
Hazen and Trefil's (1991) definition is also one of importance. They describe scientific literacy as 
the following: 

The knowledge you need to understand public issues. It is a mix of facts, 
vocabulary, concepts, history and philosophy. It is not the specialized stuff of 
the experts, but the more general, less precise knowledge used in political 
discourse. If you can understand the news of the day as it relates to science, if 
you can take articles with headlines about genetic engineering and the ozone 
hole and put them in a meaningful context... you are scientifically literate. 

James Trefil (2008) would agree that this should be the goal of science literacy: not to make every 
person an expert scientist, but rather that every individual be able to read a 
newspaper the day they graduate from high school. Unfortunately, the science educational system 
does not have this as its aspiration, and the number of citizens who are considered scientifically 
literate in the United States is low. It has only increased from 10 percent in 1988 to 28 percent in 
2010 (Miller, 1989; Miller, 2011). 






Scientifically literate individuals continually ask questions and seek answers. It is 
inevitable that one day they will ask questions about their environment and contemplate whether 
their actions are affecting the global balance of life. Questions of sustainability, earth and 
atmospheric systems, energy, natural resources, as well as human and environmental interactions 
fall under a more specific category. Those who use science to answer environmental questions and 
then alter their actions to echo the scientific demands for stability are considered not just 
scientifically literate, but also environmentally literate. 

2.4 Definitions of Environmental Literacy 

Stephen Schneider (1997), from Stanford University, stated that the objective for an 
environmentally literate society is not the unattainable goal of detailed knowledge of content. He 
thought it absurd to require citizens be knowledgeable in all environmentally relevant disciplines. 
There is much truth in this statement. It is ridiculous to expect a layperson to obtain and utilize the 
knowledge of an expert. This does not mean that an environmentally literate citizen lacks the core 
concepts, methods and skills of environmental science. The values an individual holds and the 
action he or she takes is an outward display of understanding these core concepts. 

Defining environmental literacy has proven difficult over the past 50 years. It is not only 
the ability to read and write about the environment, but an intimate connection with the 
environment that influences our actions and affects our conscious and subconscious behaviors. 

Disinger and Roth (1992) describe environmental literacy as the ability to perceive and 
interpret the health of an environmental system and then to take actions to improve, restore or 
maintain those systems. They believe environmental literacy is reflected in observable behaviors 
and actions, not just the opinions of an individual. 

An environmentally literate person knows that, as a consumer, they affect the 
environment. They acknowledge that their choices as a consumer either help or harm the 






environment and that what they do as an individual or with their community can inhibit or aid the 
Earth in sustaining biological life (see, for example, Erickson 1997, Goleman 2009, McKibben 
2007, Payne 2010). Richard Wilke and Harold Hungerford encouraged citizens to become 
environmentally knowledgeable and "above all, skilled and dedicated citizens who are willing to 
work individually and collectively for achieving and/or maintaining a dynamic equilibrium 
between quality of life and quality of the environment." (Wilke, 1996, p. 15) Those who are 
considered environmentally literate will make decisions as a consumer and involved citizen to 
keep ecosystems healthy. In return they will create a high quality of life for themselves and future 
generations. 

Just as literacy is divided into four categories, environmental literacy can also be 
categorized along its continuum. Roth (1992) describes in his book, Environmental literacy: Its 
roots, evolution and directions in the 1990s, three degrees of environmental literacy. The first is 
Nominal environmental literacy, which is the lowest literacy of the three. It includes a rudimentary 
sensitivity for environmental issues, an acknowledgement of human environment interactions and 
a basic understanding of natural systems. The second is Functional environmental literacy. 
This goes beyond the basic knowledge of human-environment interactions into an understanding 
of positive and negative effects. There is now a sense of concern for the environment based on the 
knowledge of human harm and destruction to the environment. An individual may even begin to 
develop new skills with which to analyze and assess information. They will begin to express a desire 
for personal, as well as local or global, change and action. Operational environmental literacy is 
the highest environmental literacy. This is when a deep knowledge of ecological and 
environmental concepts not only brings about understanding but is also valued enough to impact 
one's actions. This environmentally literate individual expresses a strong union between their 
values, beliefs and actions. They are constantly reading, writing and critiquing environmental 
literature and information. They have a strong connection with the environment and feel a 






responsibility to ensure its protection and stability. Action is not only taken on a personal level, 

but they encourage action in their community and on a global scale. 

The most contemporary definition of environmental literacy was released in the 2011 
NAAEE document, Developing a Framework for Assessing Environmental Literacy, which stated, 

Environmental literacy is knowledge of environmental concepts and issues; the 
attitudinal dispositions, motivation, cognitive abilities, and skills, and the 
confidence and appropriate behaviors to apply such knowledge in order to make 
effective decisions in a range of environmental contexts. Individuals 
demonstrating degrees of environmental literacy are willing to act on goals that 
improve the well-being of other individuals, societies, and the global 
environment, and are able to participate in civic life (Hollweg et al., 2011). 

Using this clear definition of environmental literacy as well as the Colorado Academic Standard's 

outline of critical concepts and skills students are expected to master in K-12 (see Table 2.1 in 

Appendix), environmental literacy can be measured and assessed. It is important to measure such 

academic knowledge because of its significant implications. Environmental literacy must be 

achieved to overcome current, and prevent future, environmental crises. 

2.5 Current demand for EE and EL 

Coyle (2005) has shown that only 1% or 2% of Americans are considered 
environmentally literate. Working with the National Environmental Education & Training Foundation 
(NEETF), he created the Environmental Literacy in America assessment tool in 1997. The 
NEETF/Roper Survey of Environmental Knowledge was a test, with only a dozen questions, used 
to assess an average American adult's knowledge on topics such as watersheds, recycling, 
electricity and other environmentally relevant topics. The survey was given and results compiled 
from 1997-2005. The results show that only one third of American adults can pass the survey with 
a grade of A, B or C. However, 95% of American adults (96% of parents) think environmental 
education should be taught in schools, which indicates that although they do not themselves have 
the knowledge necessary to be environmentally literate, they do see a need for it (Coyle, 2005). 






A total of 301 respondents in thirty-three Colorado counties, represented in Figure 2.1 
(see Appendix), completed a survey for the Colorado Alliance for Environmental Education's 
Colorado Environmental Literacy Plan (CAEE CELP). There were 60 respondents who identified as 
either a parent or a guardian of a child in K-12. When the parents were asked which topics they 
want teachers to cover in greater depth, the top responses, with 71.7% of the vote, were 
environmental systems, environment and economy, current environmental issues, and personal and 
civic responsibility. When teachers were asked what the greatest barriers were to teaching EE in 
the class, the top answer, with 22.1% of the vote, was that there is not enough time to incorporate 
EE. At the college level, over 22 staff, administrators and faculty from at least 7 universities or 
colleges responded to the survey, from departments including science, education, natural 
resources, environmental studies, museum studies, business and architecture. The survey showed 
that 23.5% implement EE in their classrooms every day, compared to only 5.9% of K-12 teachers 
("Colorado environmental literacy," 2010). 

Although many parents and teachers would like environmental education in the 
classroom, they are finding it difficult to implement because of state and national constraints. The 
No Child Left Behind Act of 2001 (NCLB) was an educational reform enacted to increase 
academic accountability nationally. This new law placed great emphasis on state-defined 
educational standards and benchmarks, with great importance placed on reading and math scores. 
A school that does not meet its state's "adequate yearly progress" (AYP) two years in a row is 
considered "in need of improvement" (Tozer, 2006, p. 463). AYP requirements have led to states firing 
teachers and closing schools. This places teachers in a difficult predicament. They are now forced 
to focus their instruction exclusively on topics covered in the state assessments. Many schools and 
teachers are obligated to abandon environmental education programs to invest more time and 
money in math and language arts. When time is spent on topics outside test-related instruction, 
it is considered discordant and precarious. 






This system has been built on coercive power, one that instills fear in the educators that 
either something bad will happen to them or something good will be taken away from them if they 
do not comply. As with all coercive power, commitment is superficial and energies have quickly 
turned to sabotage and destruction (Covey, 1991). Educators are not satisfied with the current 
system and are waiting for a bright new solution, one that values their skills as educators and 
places less emphasis on standardized tests. In spite of the current situation, many states have 
decided to pursue frameworks for environmental literacy. 

There is no shortage of prospective environmental literacy plans in the United States. 
Currently, 46 states are working on environmental literacy plans (ELPs), four states have passed 
legislation for the creation of ELPs (DC, NJ, OR, CO), and two states have completed their 
plans (MD, OR) (Navin, 2010). The No Child Left Inside (NCLI) Act is a piece of federal 
legislation that hopes to develop environmental education statewide. It aims at providing 
specialized development opportunities in environmental education. The legislation cannot move 
forward, however, unless there is an environmental literacy plan to access funds. In 2008 the 
NCLI was passed in the House with significant support. It was re-introduced into both the House 
and Senate in 2009 and is currently in committee ("NCLI," 2011). The environmental literacy plan 
that the NCLI is focusing on has been created by the Colorado Alliance for Environmental 
Education (CAEE). There are six major requirements for these environmental literacy plans that the 
CAEE has outlined: 

1. State content standards and how they relate to environmental literacy 

2. Programs for the professional development of teachers 

3. How the state will measure the environmental literacy of students 

4. The relationship of the Plan to state graduation requirements 

5. How the Plan will be implemented 

6. Peer review of the Plan by major stakeholders, including State and federal agencies, 
non-profits, and other groups (CAEE, 2011). 
This research focuses primarily on the first of these six requirements, state content standards and 
how they relate to environmental literacy. This research does, however, have implications for 
numbers two and three as well. 

Rather than restrict measurement to the standardized tests or assessments as NCLB did, a 
combination of approaches can be used to measure students' EL. Until Colorado has completed 
its ELP, we must rely on existing content standards to implement EE into the curriculum. The 
Department of Education has incorporated human environment interaction and ecological 
knowledge into the content areas of science and social studies. This research merely assesses one 
part of EL, basic environmental science knowledge acquisition, which is most accurately 
measured using a multiple-choice survey. The full measure of EL includes more than just content 
knowledge. It is not suggested that multiple-choice assessments be used to measure the other areas 
under examination in environmental literacy, such as attitude. This latent construct must be 
inferred from overt responses rather than measured directly (Milfont, 2010). 

2.6 Measuring EL with State Standards 

Academic standards were created to ensure that all school students would receive a 
high-quality and consistent public education. Although the government does have great influence, 
education is not completely nationalized or global. In fact, each state in the US has its own process 
for developing, adopting, and implementing standards. Standards-based education measures 
each individual student against a set of standards, as opposed to norm-referenced education, 
which evaluates students against their peers. This system emphasizes the use of criterion- 
referenced assessments. These educational assessments were created to make an official valuation 






of academic attitudes, skills and knowledge in a specific content area. For this research, the 
content area of interest is science. 

State agencies do not currently measure the environmental literacy of students. Colorado 
K-12 content standards for science include Physical Science, Life Science and Earth Systems 
Science. The purpose of the science standards is to ensure the readiness of our students when 
released into a world that embodies 21st-century skills and technology. It is vital that our K-12 
educational system encourages skills in research and technology, as well as a sense of care not 
only for humans, but for the flora and fauna which surround them. The members of the Colorado 
Department of Education (CDE), who compiled the standards, have emphasized that more than 
anything their desire is to give Colorado students the ability to continually interpret evidence, 
especially in this day and age when "pseudo-scientific ideas and outright fraud are becoming 
more commonplace. Developing the skepticism and critical thinking skills of science gives 
students vital skills needed to make informed decisions about their health, the environment, and 
other scientific issues facing society" ("Colorado academic standards," 2009, p. 7). The CDE wants 
to provide students with the tools necessary to decipher true science from pseudoscience. Science 
is often separated from value-laden politics, ethics and economics; however, in order to cease the 
destruction of the planet, there must be an intersection to promote personal responsibility. This 
intersection cannot affect the logic, methods, rationality or results of science, but rather the 
actions we take in response to its enlightenment. 

Some of the most pertinent issues our children will (unquestionably) face are those of the 
environment. Climate, water and air pollution, ecology, biodiversity, sustainable agriculture, toxic 
waste management, limited natural resources, and sustainable economic development: these are the 
core issues that not only our future scientists but also future citizens will face. It is important that 
individuals are able to articulate their environmental concerns, ideologies and critical rhetoric. 
With these issues in mind, the Department of Education began its revision of the existing 
Colorado Standards and the Colorado Student Assessment Program (CSAP) tests, which have been in use 
for the past fourteen years. During the transition into the new standards, Colorado school districts will 
be using what is called the Transitional Colorado Assessment Program (TCAP) through 2013, 
until the old standards are completely phased out. By 2014 school districts in Colorado should 
have completed implementing the new tests ("CSAP / TCAP," 2011). 

The new Science Standards were divided into three sections based on topical 
organization. The three standards of science are: 

1. Physical Science- Students know and understand common properties, forms, and changes in 
matter and energy. 

2. Life Science- Students know and understand the characteristics and structure of living things, 
the processes of life, and how living things interact with each other and their environment. 

3. Earth Systems Science- Students know and understand the processes and interactions of Earth's 
systems and the structure and dynamics of Earth and other objects in space ("Colorado academic 
standards," 2009). 

Each standard is broken down by high school and grade level expectations, and these are 
further broken down into concepts and skills students should master. There has recently been a 
push to either add a fourth standard, an environmental science standard, or to encourage more 
environmental education within traditional subjects, such as science and social studies. Adding a 
fourth standard is not necessarily the best option for Colorado because of K-12 time constraints in 
the classroom. The Department of Education has found that it is a better option to integrate EE into 
current classroom instruction. Using these new standards, embedded with environmental concepts, 
students' environmental knowledge can be evaluated using an instrument that combines 
assessment items from the American Association for the Advancement of Science as well as contexts 
from PISA's globally accepted environmental literacy framework (Project 2061, 1993; Hollweg et 
al., 2011). 






3. Methods 

3.1 Introduction to assessment 

When students graduate from high school and continue along their path into adulthood, it 
is important that they have been given every tool necessary to move forward into college or career. 
It is also vital that they become a knowledgeable, positive and participating member of society. It 
is the responsibility of the Department of Education, teachers, parents and society to grow 
environmentally literate individuals. Currently, there are not any state assessments testing 
environmental literacy that are directly related to state academic science standards (see Table 3.1 
in Appendix). At the college and university level of education it is difficult to quantify each 
student's understanding of the concepts learned under the Colorado science standards. 

The Introduction to Environmental Science course at the University of Colorado Denver 
(UCD) is filled with students from diverse backgrounds. Each semester there are roughly 200 
non-science majors who sit through the course. They do not necessarily enter the course because they 
are interested in Environmental Science. UCD requires that all graduating students take at least 
one course with a lab. Many students pick Intro to Environmental Science because it fulfills this 
requirement (see Figure 3.2 in Appendix). 

What this means to the professor teaching the course is that there are students from many 
different disciplines signing up for the class. Since it is an introductory course, the only 
prerequisite is the completion of the Science Standards in K-12. It is important that key concepts 
learned in High School, Middle School and even Grade School are carried through to the 
undergraduate level. Although students come from all across the state, country and even world to 
attend UCD, 70% of students who sign up for this introductory course have attended K-12 in 
Colorado. These students should, theoretically, understand key concepts in Environmental 
Science (ES) and be able to pass an assessment of their environmental knowledge. Although ES 






has only recently been incorporated into the standards, this does not imply that older students are 
any less environmentally literate than their younger peers. Environmental knowledge can come 
from sources outside of education, such as family, media, peers and personal experience. The 
purpose of this assessment instrument, Assessing the Environmental Literacy of Intro 
Environmental Science Students (AELIESS), is to gather information about a diverse group of 
students' environmental knowledge (see Figure 3.3 in Appendix). A quality learning experience is 
designed with the students in mind. Student-centered course design takes into account the 
students' knowledge, learning styles and needs. Instead of simply transmitting a body of 
environmental knowledge to the students, the educator uses active learning such as critical 
thinking and problem solving. With the use of AELIESS, the educator limits the assumptions he or 
she makes about the students' environmental knowledge and literacy. AELIESS gives educators 
some baseline data, a starting point from which the course curriculum can be built. It also gives 
freedom from repetition of concepts if students are already knowledgeable in certain areas. Most 
importantly, it aids in the ultimate goal of the course: moving students from a nominal to an 
operational environmental literacy. 

3.2 Creation of AELIESS 

When creating the new science standards, the Colorado Department of Education 
committee used a variety of resources, including: Science for all Americans (Rutherford, 1990), 
Benchmarks for Science Literacy (Project 2061, 1993), and The Atlas for Science Literacy 
(AAAS, 2001a). By relying on the Colorado Department of Education as a resource to create the 
AELIESS instrument, there is less subjectivity and higher validity concerning the content of items. 
Eight of the 16 multiple-choice questions were taken directly from the American Association for 
the Advancement of Science (AAAS) website. Each of the AAAS questions was chosen from key 
ideas within the science standards concepts. The AAAS Science Assessment was established 






under Project 2061 and a website was created for public access. For each science topic, including 
Physical Science, Earth Science, Life Science and the Nature of Science, the website has a list of 
sub-ideas, a list of items, results from field testing, and a list of student misconceptions for each 
individual question. The other eight, non-AAAS, questions on the instrument were created using 
the new Science Standards as guidance, as well as the PISA Framework for Environmental 
Literacy (Hollweg et al., 2011). Although the questions were chosen from three different topics, 
or subdomains, the questions for the instrument all had an overarching environmental theme 
unifying them. Each question further identified with one or more specific 'contexts' in 
environmental science. These contexts included biodiversity, natural resources, environmental 
quality and health, natural hazards and extreme weather, and land use (see Table 3.4 in Appendix). 
The PISA Environmental Literacy Framework provided examples of each context, all of which 
(except population growth) were used in the development of test items on AELIESS (Hollweg et 
al., 2011, p. 20). Population growth is considered a topic in the social studies standards; therefore 
the context was excluded from the assessment. Over 37% of AELIESS items included 
biodiversity, nearly 44% included natural resources, 25% included environmental quality and 
health, nearly 19% included natural hazards and extreme weather, and 12.5% included land use (see 
Table 3.3 in Appendix). 
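Given the 16-item instrument, these percentages presumably correspond to whole-item counts: 37.5% would be 6 of the 16 items, 43.75% seven, 25% four, 18.75% three, and 12.5% two; because an item can carry more than one context, the counts sum to more than 16.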

3.3 Identification of measure
A. General Information 

The instrument is titled, Assessing the Environmental Literacy of Intro Environmental 
Science Students (AELIESS) . It has the ability to highlight topics and concepts a majority of the 
students may be struggling with. Areas the students have mastered can also be identified. By 
highlighting these problem areas the instructor can make the most of their time with the students 
and can focus on their actual needs, as opposed to their theoretical needs. This assessment 






instrument could potentially be used by any introductory course in environmental science; 
however, the questions are based on Colorado Standards, so this assessment is most effective 
when given to students who have attended at least grades 9-12 in Colorado. 

1. Purpose(s) of measure 

The measure assesses environmental literacy among students in Intro to Environmental Science. The 
assessment could potentially be given to K-12 students, post-secondary students, pre- and in- 
service teachers, or the general public. The purpose of the assessment is not to be used as an exit 
exam for high school graduates, although it could accurately measure their knowledge in 
environmental science. It is not my intention to create yet another obstacle standing between high 
school students and their future goals. Standardized exams are many times the unscrupulous 
gatekeeper of occupational and educational opportunity. The instrument, for this research purpose, 
is to be used by instructors or professors in higher education to assess the environmental literacy 
of their students. With this information they may quickly discover which topics they should 
spend the most time reviewing or building upon throughout the semester. The instrument is an 
excellent indicator of the students' knowledge; however, more research needs to be done to make 
the connection between what students know, how they feel, and how they act. It is important to 
keep in mind that a student could score 100% on the assessment and still make poor 
environmental decisions in their everyday life. Qualitative research is encouraged to bridge the 
gap for complete environmental literacy assessment. 

2. Specific sub-domains assessed 

The instrument has an overarching theme examining the students' understanding of core 
concepts in environmental science. The number of questions an individual answers correctly is 
assumed to correlate positively with the individual's environmental literacy. Questions were chosen from 
content covered under sixth grade, eighth grade and high school standards, as lower grades' 
concepts were simplified versions of the higher grade levels. There are three different content 






areas under the standards: Physical Science, Life Science and Earth Systems Science. Each 
content area is further divided into concepts and skills the students should master (see Table 2.1 
in Appendix). The following represents the content areas and their concepts, which were used to 
create the AELIESS. Questions were selected based on their correlation to environmental 
concepts. Sixteen questions were created for the instrument for quantitative analysis. 

B. Intended test population 

1. Age 

The instrument can be given to anyone age 19 or older; the exception is an individual who 
graduated early from high school and is enrolled in a college-level course. 
2. Special groups 

The instrument was not created for nor tested using individuals with disabilities or 
behavioral problems. 

C. Administration 

The instrument can be administered in individual or group settings. It is suggested that it 
be administered in a quiet room without distractions to maximize reliability. It is also suggested 
that the assessment be given on the first day of class if given in a classroom setting. 

D. Time required 

The actual testing time is approximately 20 minutes. Total administration time is 
approximately 30 minutes, 5-10 minutes of which are spent establishing rapport and giving oral instructions 
to the students. Any questions the students might have are answered before passing out the 
instrument. 

E. Stimulus items 

The respondent is given a form on which they fill out the demographic information, 
including their gender, age and ethnicity. The respondent is then asked if he or she graduated high 
school and must circle either yes or no. They are also asked how many years of K-12 they 
attended in Colorado. The instructions then ask them to read and complete 16 multiple-choice 
questions by circling one answer for each. Only one question, under the life science questions, has a 
pictorial representation (a flow chart) to aid in completing the question. The 16 questions are used to 
quantify the respondent's understanding of basic environmental systems and concepts. This 
portion is all that is necessary to assess the students' environmental knowledge. 

The assessment could be given with a scantron so that the hard copies could be reused, 
saving time and resources. 

F. Administration Procedures 

After obtaining approval for human subjects research from the Institutional Review Board, 
the instrument can be administered and scored by individuals without formal training in 
assessment. The instrument was created for Colorado educators in the Environmental Sciences, 
specifically at the college and university level. There are not multiple tests or sections; thus there 
is not a specific sequence of actions for administering the measure. The first official 
administration of AELIESS was conducted in Spring 2012, before classes had begun. In the 
future, a second assessment could be created assessing the respondent's actions, values and 
behaviors, in which case, the two instruments should be taken simultaneously and then scored to 
assess overall disposition towards the environment, as well as gaps between attitude and behavior. 

G. Scoring Procedures 

Interpretation of the instrument's scores requires graduate training in environmental 
science or related fields. To score the assessment, the number of correct answers is tallied, 
giving a raw score for each individual student; these scores are then compiled and averaged. This gives 
an idea of the overall performance of the class. The second step in scoring the assessment is to 
sum up the individuals' correct answers for each sub-domain (Physical Science, Life Science and 
Earth Systems Science), and these sub-domain scores are then compiled and averaged. This gives an idea of the 
overall performance of the class within each sub-domain. By looking at the averages, medians and 






modes within each domain, areas of difficulty can be identified. For this type of continuous scale, 
zero to 16, the most meaningful measure of central tendency is the mean. Scoring of the 
multiple-choice section of the instrument could be done quickly and easily using scantrons. This is 
the most efficient way to score large groups of students with as little human error 
as possible. 
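As a rough illustration of the two scoring steps described above, the following Python sketch computes the class mean and sub-domain means from a 0/1 response matrix. It is only a sketch: the author scored the instrument with scantrons, Excel and SPSS, and the item-to-sub-domain assignment shown here is hypothetical (the real mapping comes from Table 2.1 in the Appendix).

```python
import numpy as np
import pandas as pd

# Hypothetical scored responses: 144 students x 16 items, 1 = correct, 0 = incorrect.
rng = np.random.default_rng(0)
responses = pd.DataFrame(rng.integers(0, 2, size=(144, 16)),
                         columns=[f"item_{i}" for i in range(1, 17)])

# Hypothetical item-to-sub-domain mapping, for illustration only.
subdomains = {
    "Physical Science":      [f"item_{i}" for i in (1, 2, 3, 4, 5)],
    "Life Science":          [f"item_{i}" for i in (6, 7, 8, 9, 10, 11)],
    "Earth Systems Science": [f"item_{i}" for i in (12, 13, 14, 15, 16)],
}

# Step 1: raw score per student (0-16), compiled into a class-wide mean percentage.
raw_scores = responses.sum(axis=1)
print(f"Class mean: {raw_scores.mean() / 16:.1%}")

# Step 2: sub-domain scores, compiled and averaged across the class.
for name, items in subdomains.items():
    print(f"{name}: {responses[items].mean(axis=1).mean():.1%}")
```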

H. Interpretation procedures 

Demographic information should be analyzed for trends and changes in the student population 
over time (for example, the average age of a population may increase or decrease from one 
semester to another, which could correlate to overall performance). Trends should also be 
analyzed for ethnicity. The total mean score for the population, as well as for each of the sub-domains 
(Physical Science, Life Science, Earth Systems Science), should be calculated and analyzed to 
reveal an overall level of understanding of environmental concepts as well as to reveal which, if any, of 
the three sub-domains the students are struggling with. 

3.4 Support for measure
A. Item selection 

Each item on the assessment was put through a pilot test before the final instrument was 
completed. This 16-item MC question form was collected from students in two Environmental 
Science sections at UCD in the fall semester of 2011. There were not any individuals who 
identified themselves as having any special education needs. First, the statistical properties of 
individual items were examined in the combined sample. Items for which responses were 
frequently missing (i.e., suggesting that such items were poorly worded or frequently 
misunderstood) were eliminated. Using SPSS, each item score was correlated with the total score 
within each scale, and then items with the lowest item-total correlations were modified. Principal 
Component Analysis was used as a second approach for clarifying scale structure and determining 






the strength of scale membership for each item. Each of the analyses identified three predominant 
factors and one or two secondary factors that accounted for the majority of variance within a scale. 
These latter factors contained only a few items and accounted for minimal variance. 
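A minimal sketch of the item-screening step described above is given below. The author ran this analysis in SPSS; this is an illustrative re-creation in Python with made-up data, and the corrected item-total correlation (which excludes each item from its own total) is one common way of computing the statistic, not necessarily the exact SPSS setting used.

```python
import numpy as np
import pandas as pd

# Hypothetical 0/1 scored pilot responses (rows = students, columns = items).
rng = np.random.default_rng(1)
pilot = pd.DataFrame(rng.integers(0, 2, size=(60, 16)),
                     columns=[f"item_{i}" for i in range(1, 17)])

# Corrected item-total correlation: correlate each item with the total score of
# the remaining items, so an item does not inflate its own correlation.
item_total = pd.Series({
    col: pilot[col].corr(pilot.drop(columns=col).sum(axis=1))
    for col in pilot.columns
})

# Items with the weakest relationship to the rest of the scale are candidates
# for rewording (or elimination, if responses were also frequently missing).
print(item_total.sort_values().head())
```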
B. Validity evidence 

Validity was based on the content of the items (content validity), the internal structure 
of the instrument (discriminant validity), and whether the operationalizations of the construct 
actually measure Environmental Science and literacy (construct validity). Using Excel, item 
analysis was conducted to determine internal consistency. This included assessing the difficulty 
of each AELIESS item, as well as the relationship between how well students did on the item and 
their total score. The item difficulty index ranges from 0 to 1; the higher the value, the easier the 
question. If the item difficulty is 0.79, this means that 79% of the students answered the question 
correctly. The ideal difficulty for a four-response multiple-choice question is a moderate score of 
62%. Difficulty is measured on a scale classifying 85% or above as easy, 51 to 84% as moderate 
and 50% or below as hard. Comparing students' item responses to their total test scores assesses 
the quality of individual items. This test should discriminate between students who are 
environmentally knowledgeable and those who are not. The item has low discrimination if it is too 
difficult or too easy. Item discrimination, also called point-biserial correlation (PBS), is 
considered good if it is above 0.30, fair if it is between 0.10 and 0.30, and poor if below 0.10. 
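The item-analysis quantities just described can be reproduced with a short script. The sketch below (Python rather than the Excel workbook actually used) applies the difficulty and PBS bands quoted above to a hypothetical 0/1 response matrix; the data and variable names are invented for illustration.

```python
import numpy as np
import pandas as pd

# Hypothetical 0/1 scored responses (144 students x 16 items), for illustration only.
rng = np.random.default_rng(2)
resp = pd.DataFrame(rng.integers(0, 2, size=(144, 16)),
                    columns=[f"item_{i}" for i in range(1, 17)])
total = resp.sum(axis=1)

for col in resp.columns:
    difficulty = resp[col].mean()      # proportion answering correctly (0 to 1)
    pbs = resp[col].corr(total)        # point-biserial correlation with the total score
    band = "easy" if difficulty >= 0.85 else "moderate" if difficulty > 0.50 else "hard"
    quality = "good" if pbs > 0.30 else "fair" if pbs >= 0.10 else "poor"
    print(f"{col}: difficulty = {difficulty:.2f} ({band}), PBS = {pbs:.2f} ({quality})")
```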

Construct validity was examined using Principal Component Analysis (PCA). Loadings 
in excess of 0.71 (50% overlapping variance) are considered excellent, 0.63 (40%) very good, 
0.55 (30%) good, 0.45 (20%) fair, and 0.32 (10%) poor. The items are expected 
to load primarily on one overarching component, Environmental Science, or on three components, 
Physical Science, Life Science and Earth Systems Science. The eigenvalues over one should 
account for most of the variance. 
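The eigenvalues and loadings referred to here can be obtained from the item correlation matrix. The sketch below shows that calculation in NumPy, with a randomly generated matrix standing in for the real AELIESS data; it is an illustration of the method, not a reproduction of the SPSS run reported in Tables 4.1 and 4.2.

```python
import numpy as np

# Hypothetical 0/1 scored responses (144 students x 16 items), illustration only.
rng = np.random.default_rng(3)
X = rng.integers(0, 2, size=(144, 16)).astype(float)

# PCA via the eigendecomposition of the item correlation matrix (the "Initial
# Eigenvalues" that SPSS reports in Table 4.1).
R = np.corrcoef(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

print("Eigenvalues > 1:", np.round(eigvals[eigvals > 1], 3))
print("Cumulative % of variance:", np.round(100 * np.cumsum(eigvals) / eigvals.sum(), 1))

# Loadings on the first three components = eigenvectors scaled by sqrt(eigenvalue);
# compare against the 0.71 / 0.63 / 0.55 / 0.45 / 0.32 benchmarks quoted above.
loadings = eigvecs[:, :3] * np.sqrt(eigvals[:3])
print(np.round(loadings, 3))
```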






C. Reliability 

Internal consistency estimates the reliability of test scores using Cronbach's alpha. The 
scale, from 0 to 1, indicates the degree to which the set of items measures a single unidimensional 
latent construct. The construct for this research, unifying the items, is Environmental Science. 
Higher values of alpha indicate higher intercorrelations among test items and thus increased 
reliability. A Cronbach's α > 0.9 is considered to have excellent internal consistency. Good internal 
consistency is 0.9 > α > 0.8, acceptable is 0.8 > α > 0.7, questionable is 0.7 > α > 0.6, poor is 0.6 > α > 0.5, 
and unacceptable is 0.5 > α. Running Cronbach's alpha in SPSS gives the Item-Total Statistics, 
which include Cronbach's alpha if an item is deleted. This gives the option of removing an item 
to significantly raise the internal consistency. 
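As an illustration of the computation behind these figures (the author obtained them from SPSS; this is an independent sketch with made-up data), Cronbach's alpha and the "alpha if item deleted" statistic can be computed directly from the item variances and the total-score variance:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 0/1 responses (144 students x 16 items), illustration only.
rng = np.random.default_rng(4)
X = rng.integers(0, 2, size=(144, 16))

print("alpha:", round(cronbach_alpha(X), 3))

# "Cronbach's alpha if item deleted": recompute alpha with each item removed in turn;
# a large increase flags an item that is hurting internal consistency.
for i in range(X.shape[1]):
    print(f"item {i + 1} deleted -> alpha = {cronbach_alpha(np.delete(X, i, axis=1)):.3f}")
```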






4. Results and Discussion 

Figure 4.1, Difficulty and Discrimination Distributions, illustrates the correlation of each AELIESS 
multiple-choice item to the total score (0 = no correlation, 1 = perfect correlation) as well as the 
difficulty of the items (0 = most difficult, 1 = least difficult). 



[Figure 4.1: Difficulty and Discrimination Distributions. Horizontal bar chart showing, for each of the 16 AELIESS items, its difficulty and its discriminant (point-biserial) value on a 0.00 to 1.00 scale.]



Figure 4.1, Difficulty and Discrimination Distributions, illustrates the difficulty of each
item as well as its correlation to the overall score. All of the items, except numbers 14 (Earth
Systems Science) and 11 (Life Science), were above 0.30 for difficulty. In order of decreasing
difficulty, the items are: 14, 8, 11, 15, 4, 12, 5, 16, 13, 2, 3, 6, 7, 9, 1 and 10. Item number 14 was
the most difficult, with only eleven individuals out of 144 (8%) answering correctly. This item was also
the most difficult for students in the pilot test. After changing the wording, the difficulty
was expected to decrease, but it did not. The PBS for number 14 is 0.27, which is at the higher end
of fair, indicating that the eleven students who did answer this item correctly scored highly overall.
There were only nine Environmental Science majors in the class and, of these, four answered
number 14 correctly. The fact that students did poorly on this question indicates that students
are either not familiar with balances between energy production and environmental impact, or they
are not familiar with the newest forms of renewable energy. Many students are familiar with solar
energy, which is why it was the number one incorrect response from all participants. It is
important to identify common misconceptions so that they can be addressed, which is why the item
was not removed from the test after the pilot study.

The PBS also revealed that all of the correlations were above 0.20, which indicates high
discriminant validity. Students who showed the highest comprehension of the concepts scored the
highest overall and got the most difficult items correct, whereas students who had lower test
scores got the difficult items incorrect. The items with correlations of 0.40 or higher, showing the highest validity
on the exam, were numbers 12, 10, 7, 13, 3, 5 and 6, which were primarily from the Life Science
sub-domain.






Table 4.1 Total Variance Explained: displays the eigenvalues; the first three components explain
33.73% of the variance, while the seven components with eigenvalues greater than one explain 61%
of the variance.



               Initial Eigenvalues                      Extraction Sums of Squared Loadings
Component   Total   % of Variance   Cumulative %     Total   % of Variance   Cumulative %
1           2.388   14.925          14.925           2.388   14.925          14.925
2           1.563    9.771          24.696           1.563    9.771          24.696
3           1.445    9.032          33.728           1.445    9.032          33.728
4           1.240    7.752          41.480
5           1.058    6.612          48.092
6           1.038    6.486          54.578
7           1.007    6.293          60.871
8            .945    5.908          66.779
9            .895    5.595          72.374
10           .827    5.166          77.540
11           .731    4.569          82.109
12           .694    4.337          86.446
13           .644    4.024          90.470
14           .551    3.447          93.917
15           .508    3.173          97.090
16           .466    2.910         100.000

Extraction Method: Principal Component Analysis.



Table 4.2 Principal Component Analysis: displays loadings on the three primary components. Loading
occurred primarily on the first component.



Component Matrix(a)

Item    Component 1   Component 2   Component 3
1        .387          -.434          .175
2        .315           .018          .255
3        .505           .050         -.396
4        .249           .139         -.558
5        .390           .235         -.325
6        .482           .174          .028
7        .478          -.027          .056
8        .188           .515          .160
9        .396          -.326         -.146
10       .582          -.372         -.086
11       .093          -.182          .531
12       .450           .422          .311
13       .607          -.129         -.014
14       .008           .596         -.129
15       .160           .408          .182
16       .272           .085          .538

Extraction Method: Principal Component Analysis.
a. 3 components extracted. (Shading in the original matrix distinguished primary from secondary loadings.)


Using Principal Component Analysis (PCA) on the results, seven eigenvalues larger than 1 were
identified, accounting for 61% of the variance. The items could have loaded
according to their contexts (see Table 3.3); however, the greatest loadings were on three principal
components, identified as Physical Science, Life Science and Earth Systems Science (see Tables
4.1 and 4.2). There were meaningful correlations, of .32 or larger, between the items and the
components they loaded on. The greater the loading, the more that variable is a pure measure of
the factor.

There was no loading greater than 0.61 on any one component. A majority of the
questions loaded on component one. High loading on only one component was expected, given the
unifying theme of Environmental Science. If the questions had loaded atypically, this would
suggest that the questions selected for the study were not environmentally founded. Item
analysis was also used to identify the difficulty of each item on the instrument and to
compare how well the students' performance on an item correlated with their overall score. This
provided greater clarity when attempting to interpret the factors and understand the underlying
dimension that unified the groups of variables loading on them.

Cronbach's Alpha (Tables 4.3-4.5) 



Table 4.3 Case processing summary: presents the sample size and the percentage of valid and excluded
cases.





                        N       %
Cases   Valid          144    88.3
        Excluded(a)     19    11.7
        Total          163   100.0

a. Listwise deletion based on all variables in the procedure.



Table 4.4 Reliability: a measure of the assessment's precision in scoring environmental
science knowledge.



Cronbach's Alpha   N of Items
.602               16



Table 4.5 Item-Total Statistics: includes descriptives for each item, including which items should
be deleted to increase internal consistency.





Item   Scale Mean if   Scale Variance    Corrected Item-     Squared Multiple   Cronbach's Alpha
       Item Deleted    if Item Deleted   Total Correlation   Correlation        if Item Deleted
1      7.566434        7.029             .253                .196               .582
2      7.727273        6.968             .215                .154               .588
3      7.720280        6.803             .284                .227               .576
4      7.916084        7.182             .122                .175               .604
5      7.825175        6.788             .275                .216               .577
6      7.720280        6.817             .278                .201               .577
7      7.678322        6.778             .308                .188               .572
8      8.139860        7.225             .165                .105               .595
9      7.629371        7.114             .182                .086               .593
10     7.559441        6.806             .367                .282               .565
11     8.034965        7.432             .041                .117               .615
12     7.839161        6.583             .358                .238               .562
13     7.790210        6.711             .309                .197               .571
14     8.272727        7.429             .173                .116               .594
15     8.000000        7.133             .154                .125               .598
16     7.825175        7.109             .148                .142               .600



The tables above (Tables 4.3-4.5) include the Case Processing Summary, Reliability and
Item-Total Statistics for Cronbach's alpha. The alpha value for the AELIESS assessment, using all
16 multiple-choice items, was .602 (Table 4.4). Higher internal consistency could be achieved if
additional items were added to the sub-domains Physical Science and Earth Systems Science,
which contained only three questions each. Table 4.5 reveals how Cronbach's alpha would be
affected if an item were deleted; as the table shows, deleting any of the 16 items would not greatly
improve the reliability.



Table 4.6 Demographic information, including percentages of represented ethnicities.



Caucasian   Hispanic   Asian   African American   Middle Eastern   Other    Non-response
50%         12.5%      4.86%   4.17%              3.47%            11.11%   13.89%



The sample included 70 males, 59 females and 15 without a response, resulting in a
total sample size of 144, with a mode age of 19 and a mean age of 22. Half of the population identified
as Caucasian, whilst the other half identified as Hispanic, Asian,
African American, African, Middle Eastern, Italian, German, Australian, Korean, Native
American, or Other, or did not respond to the question at all. This was considered an ethnically
diverse sample; however, because sample sizes were small for ethnicities other than Caucasian
(many ethnic groups had fewer than three members), students' scores could not be examined by
ethnicity with a t-test or ANOVA. Several t-tests and an ANOVA were run to determine whether
other demographic data (gender, age, K-12 attendance, high school graduation) affected how well
individuals performed on the environmental assessment.



Table 4.7 Independent t-test between men's and women's scores.





                              Levene's Test for           t-test for Equality of Means
                              Equality of Variances
                              F       Sig.       t       Sig. (2-tailed)   Mean Difference   Std. Error Difference   95% CI Lower   95% CI Upper
Equal variances assumed       1.123   0.291      1.167   0.245             0.56              0.48                    -0.389         1.509
Equal variances not assumed                      1.177   0.241             0.56              0.476                   -0.382         1.502



Table 4.8 Group statistics for men and women. 





Male=1         N    Mean   Std. Deviation   Std. Error Mean
1 (male)       70   8.81   2.83             0.338
0 (female)     59   8.26   2.57             0.335



A t-test was run to see if there was a difference in scores between men and women. A p-value
of 0.245 > 0.05 indicates that there is not a significant difference in scores (see Table 4.7).
Sample sizes were very close for the two populations, as were the mean scores, which were
8.81 for men and 8.257 for women (Table 4.8). This indicates that individuals' environmental literacy
is low regardless of gender. If there had been more than two groups (men and women) for the
factor (gender), an ANOVA could have revealed differences in scores within and between the sub-
domains.
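The comparison in Tables 4.7 and 4.8 can be approximated outside of SPSS with Levene's test and an independent-samples t-test. The sketch below uses simulated scores drawn from the group means and standard deviations in Table 4.8, so its output will only roughly resemble the reported values.

```python
# Independent-samples t-test sketch on simulated (not actual) scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
men = rng.normal(loc=8.81, scale=2.83, size=70)     # shaped like Table 4.8, but random
women = rng.normal(loc=8.26, scale=2.57, size=59)

levene_stat, levene_p = stats.levene(men, women)             # equality of variances
t_stat, t_p = stats.ttest_ind(men, women, equal_var=True)    # equal variances assumed
print(f"Levene p = {levene_p:.3f}; t = {t_stat:.3f}, p (2-tailed) = {t_p:.3f}")
```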






Table 4.9 Independent t-test between high school graduates and non-graduates. 





                              Levene's Test for           t-test for Equality of Means
                              Equality of Variances
                              F       Sig.       t       Sig. (2-tailed)   Mean Difference   Std. Error Difference   95% CI Lower   95% CI Upper
Equal variances assumed       0.588   0.445      0.476   0.635             -0.9              1.892                   -4.647         2.847
Equal variances not assumed                      0.592   0.656             -0.9              1.519                   -18.055        16.255



Table 4.10 Group statistics for high school graduates and non-graduates. 





Graduate=1           N     Mean   Std. Deviation   Std. Error Mean
1 (graduate)         120   8.6    2.658            0.243
0 (non-graduate)     2     9.5    2.121            1.5



Another question was whether those who graduated from high school had a better grasp
of environmental concepts. Table 4.9 shows a p-value of 0.635 > 0.05, which indicates that there
is not a significant difference in scores. Both groups of students have similar environmental
knowledge, although Table 4.10 shows that the mean score for those who did not graduate high
school was 9.5, while the mean for graduates was only 8.6. The sample size for the non-graduates was only
two individuals, versus 120 in the graduates' population; these two students could have received their
GEDs or could have been home schooled. Given a larger sample with more non-graduates, this
statistic could change significantly.



Table 4.11 Independent t-test between those who attended Kindergarten through 12th grade in
Colorado and those who did not.





                              Levene's Test for           t-test for Equality of Means
                              Equality of Variances
                              F       Sig.       t       Sig. (2-tailed)   Mean Difference   Std. Error Difference   95% CI Lower   95% CI Upper
Equal variances assumed       0.125   0.724      0.203   0.839             -0.104            0.513                   -1.118         0.91
Equal variances not assumed                      0.207   0.836             -0.104            0.503                   -1.105         0.897



Table 4.12 Group statistics for K-12 Colorado and non-Colorado attendees 





K-12yes=1             N    Mean   Std. Deviation   Std. Error Mean
1 (Colorado K-12)     90   8.51   2.712            .286
0 (non-Colorado)      39   8.62   2.581            .413



The most surprising of the independent t-tests was between those who attended
Kindergarten through 12th grade in Colorado and those who did not. The AELIESS assessment
was specific to Colorado environmental knowledge in terms of the Colorado content standards that
were used to construct the questions as well as the nature and specificity of the questions themselves.
For example, item 13 specifically addresses available, renewable energy in Colorado. One could
assume that those who attended school in Colorado would perform better on this question. Table
4.11 reveals a p-value of 0.839 > 0.05, indicating that there is not a significant difference in scores
between those who attended Kindergarten through 12th grade in Colorado and those who did not.
The sample size was 90 for Colorado attendees and 39 for non-Colorado K-12 attendees, and the
mean scores were 8.51 compared to 8.62 (see Table 4.12). Had the assessment contained more
Colorado-specific questions, the statistical difference could have been significant. Question 13 was
considered one of the best questions on the assessment, with high internal validity (see Figure
4.1). An important aspect of environmental literacy is that students are aware of, not just global,
but local means for solving environmental problems and achieving change.

Table 4.13 Descriptives on a One-factor ANOVA for Age and Average scores. 





Age     N     Mean    Std. Deviation   Std. Error   95% CI Lower Bound   95% CI Upper Bound   Min.   Max.
18      11     6.82   1.601             .483         5.74                 7.89                 4      10
19      27     8.37   2.817             .542         7.26                 9.48                 4      15
20      23     7.91   2.539             .529         6.82                 9.01                 3      12
21      20     7.70   3.729             .834         5.95                 9.45                 0      14
22      16     8.81   2.228             .557         7.63                10.00                 5      13
23      11     7.91   2.256             .680         6.39                 9.42                 4      11
24       6     8.83   2.927            1.195         5.76                11.90                 4      12
25       3     8.33   2.082            1.202         3.16                13.50                 6      10
26       4     9.25   1.258             .629         7.25                11.25                 8      11
27       3    11.67   2.517            1.453         5.42                17.92                 9      14
28       6     9.33   1.966             .803         7.27                11.40                 6      11
29       2    13.00    .000             .000        13.00                13.00                13      13
30       2     8.50   4.950            3.500       -35.97                52.97                 5      12
31       1    10.00                                                                           10      10
32       1     6.00                                                                            6       6
33       1    10.00                                                                           10      10
34       1    13.00                                                                           13      13
39       2    11.00   2.828            2.000       -14.41                36.41                 9      13
Total   140    8.39   2.763             .234         7.93                 8.85                 0      15



Table 4.14 One-way ANOVA for Age and Average scores. 





                   Sum of Squares   df    Mean Square   F       Sig.
Between Groups     177.338          17    10.432        1.440   .130
Within Groups      884.055          122   7.246
Total              1061.393         139









[Figure 4.2: horizontal bar chart, "Mean Scores for Age Groups," plotting mean score out of 16 for each age group.]



Figure 4.2 Graphical representation of mean scores (out of 16 multiple-choice questions) for each
age group (18-34 and 39).



Table 4.13 provides descriptives on a one-factor ANOVA for age. The ages range from
18 to 39. The mean and range of scores for each age group are given, as well as the sample size of
each group. It is interesting to note that the age group '21' had both the highest score of 14 and
the lowest score of zero. Table 4.14 gives a p-value of 0.130 > 0.05, indicating that there is not a
significant difference between age groups and average score. Significance within groups could not
be tested because some age groups had fewer than two individuals. For a graphical
representation of mean scores for age groups, see Figure 4.2. Visually, it appears that older
students tend to have higher mean scores; this pattern is supported by the data analysis in Table 4.15.
Although every group from 18 to 25 contained at least one individual who scored below 6, the single
individual 32 years of age scored a 6, graphically making that age group appear the most
environmentally illiterate. Figure 4.2 could therefore be misleading, which is why it must be
examined alongside Table 4.13.
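A hedged sketch of the one-way ANOVA in Table 4.14 is given below; it runs on simulated scores rather than the actual data (a subset of the age-group sizes echoes Table 4.13, but the values are random).

```python
# One-way ANOVA of score by age group, on simulated placeholder data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
group_sizes = {18: 11, 19: 27, 20: 23, 21: 20, 22: 16, 23: 11}   # subset for brevity
groups = [rng.normal(loc=8.4, scale=2.8, size=n) for n in group_sizes.values()]

f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")   # p > 0.05 would mean no significant age effect
```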



Table 4.15 Independent t-test between individuals 18 to 20 years old and those 21 to 39 years old,
revealing a significant difference (p-value 0.006 < 0.05).





                              Levene's Test for           t-test for Equality of Means
                              Equality of Variances
                              F       Sig.       t       Sig. (2-tailed)   Mean Difference   Std. Error Difference   95% CI Lower   95% CI Upper
Equal variances assumed       1.645   0.202      2.766   0.006             -6.937            0.459                   -2.176         -3.62
Equal variances not assumed                      2.834   0.005             -1.269            0.448                   -2.155         -0.383



Table 4.15 and Figure 4.2 both seemed to indicate a slight increase in score with age. To
test this trend, the sample was split, with one group representing 18 to 20 years of age and the
other group 21 to 39 years of age. Table 4.15, the independent t-test between individuals 18 to 20
years old and those 21 to 39 years old, shows a p-value of 0.006 < 0.05, indicating a significant
difference in scores between the two groups.



Table 4.16 Group statistics for ages 18 to 20 and 21 to 39.





Under 21=1        N    Mean   Std. Deviation   Std. Error Mean
1 (18 to 20)      81   7.86   2.867            0.319
0 (21 to 39)      60   9.13   2.439            0.315



The mean score for those under 21 was 7.86 and for those 21 and older, 9.13 (see Table
4.16). It is not clear why individuals in the older group would perform significantly better than
their younger peers. One plausible explanation is that these students have taken more college-level
courses, any of which could have been related to environmental science. It could also be that they
are more academically savvy and likely to look outside of academia for environmental knowledge and
education, an idea discussed further below.



[Figure 4.3: bar chart, "Sub-domain scores compared to total mean score," comparing the total mean score of 52.18% with sub-domain mean scores of 67.83% (Physical Science), 53.38% (Life Science) and 37.76% (Earth Systems Science).]



Figure 4.3 A comparison of the participants' mean scores in the three sub-domains to the total
mean score.



Figure 4.3 illustrates the overall performance of the population by comparing the sub-
domain scores to the total mean score. The total mean score for the class was 52.18%, which
shows that the class as a whole does not have a strong foundation in environmental science or
high levels of environmental literacy and needs further assistance in one or more of the three sub-
domains. Statistical analysis showed that on average the students scored 67.8% in Physical
Science, 53.4% in Life Science, and 37.8% in Earth Systems Science. The obvious area of concern
for this population of Intro Environmental Science students is Earth Systems Science. If we
view Figure 4.3 alongside Figure 4.1 and Table 3.3 (in the Appendix), a few observations can be
made. The most difficult questions for the students came from Life Science, items 8 and 11, as
well as from Earth Systems Science, items 14 and 15 (see Figure 4.1). In Table 3.3, Distributions
of Contexts, these items fall most heavily under biodiversity and environmental quality and health.
These are topics for which the instructor should allocate greater time for review.
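The sub-domain percentages in Figure 4.3 follow directly from the item-to-sub-domain mapping in Table 2.1 (items 1-3 Physical Science, 4-13 Life Science, 14-16 Earth Systems Science). The sketch below shows the computation on a hypothetical 0/1 response matrix standing in for the scored AELIESS data.

```python
# Sub-domain percentage scores versus the overall mean, on simulated responses.
import numpy as np

responses = np.random.default_rng(5).integers(0, 2, size=(144, 16))
subdomains = {
    "Physical Science": slice(0, 3),         # items 1-3
    "Life Science": slice(3, 13),            # items 4-13
    "Earth Systems Science": slice(13, 16),  # items 14-16
}

print(f"Total mean score: {responses.mean() * 100:.2f}%")
for name, cols in subdomains.items():
    print(f"{name}: {responses[:, cols].mean() * 100:.2f}%")
```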

The assessment could also show that students have a firm grasp on the foundational
concepts learned in high school. In that case, the structure of the Intro to Environmental Science course could
incorporate a more qualitative approach, increasing the students' connection with the environment
through reading and research on topics of interest, weekly field exercises, and research papers in
oral and written form. The poor results illustrated in Figure 4.3 were not surprising. There has
been an obvious lack of emphasis placed on environmental knowledge in the world of academia.
Until recently, educators and policy makers have not seen the need for developing an
environmentally literate youth. Transitioning environmental science into K-12 standards will be
difficult for many educators. There is global concern as to whether teachers have the necessary
basic knowledge of environmental concepts to teach students (Loubser, 2001). This could be why
students in this sample performed so poorly on the AELIESS. The use of the AELIESS could,
therefore, be extended to K-12 teachers to highlight gaps in their knowledge. It should not be used
to reprimand or punish teachers; after all, it is not the educators' fault that they were not required to
take an environmental science course before receiving licensure. The main use of the assessment
is to provide post-secondary educators and teacher development programs with a tool to assess
their students' environmental knowledge and work more proficiently towards environmental
literacy.






5. Implications and Conclusion
5.1 Challenges for Education

Once gaps in content have been identified using this assessment (AELIESS), the
instructor is then left to address any basic knowledge acquisition insufficiencies. There are many
different academic resources and materials available covering environmental topics in life science,
physical science and earth systems science for K-12, but there are fewer available for higher
education. In other words, changes to curriculum and instruction in higher education will require
time to adapt K-12 resources and materials. Very little research has been done examining the
quality of environmental texts and curricula in the United States. Erdogan (2009) has shown that
the curricula in Bulgaria and Turkey are lacking in the behavior (action) component of EL but are
strong in knowledge. This may also be the case in American textbooks and curricula. It is up to the
instructor to decide whether he or she wants to focus on broad environmental concepts the
students are struggling with or whether it would be better to focus on an individual topic within
a sub-domain, and then decide what pedagogical approach should be taken to emphasize a
particular concept or domain.

Using the science standards, each item on the instrument can be traced back to a specific
skill the students should master. Take, for example, a case where only 9% of students answer item one correctly:
this question falls under the high school Physical Science standard, and the concept
and skill the student should master with this question is that "Energy exists in many forms such as
mechanical, chemical, electrical, radiant, thermal and nuclear, that can be quantified and
experimentally determined" ("Colorado academic standards," 2009). This topic can be referenced
in, for example, the text Environment: The Science Behind the Stories, specifically in
chapter 4, From Chemistry to Energy to Life (Withgott, 2009). Each question has a specific topic
the instructor can focus on by reviewing the science standards. Another source is the AAAS
website, which provides a plethora of concepts and ideas to cover under each content area.
However, this means that a) educators need to be familiar with the K-12 content science standards,
and b) have the luxury of time to make these connections and change their teaching as well as
assessment.

5.2 Limitations of Assessment

Environmental knowledge can come from sources outside of education, such as family,
media, peers and personal experience. With the multitude of factors impacting an individual's
environmental literacy, it is nearly impossible to claim that literacy is a direct result of education.
There is no question, however, that literacy is greatly impacted by the quality of education.

No qualitative questions were included in the assessment as a means of testing whether
students could answer open-ended questions using a combination of scientific knowledge and
thought. Qualitative questions require appropriate response mechanisms, giving insight into the
respondent's attitude and possibly their individual actions and behaviors. In reality the
environment is a holistic system; therefore the physical sciences and the social sciences should not
be considered in isolation from one another. Students should be given opportunities to integrate,
synthesize, and apply knowledge from the different content areas. In higher education, however,
students are typically assessed by separating science from social studies, reading, writing, math,
communicating and health. Future adaptations of the AELIESS tool should include, at the very
least, social studies.

Rather than add an environmental content section to the standards, the CDE has
incorporated environmental topics into the biology standards. This integration has been openly
accepted because the topics are profoundly interconnected. Biology and environmental science
should be integrated in education, as should chemistry and the earth sciences. Combined in education,
they create a very strong candidate for the science field. A student who is able to make
interdisciplinary connections between the sciences is more likely to solve complex biological
problems (Roth, 1976; Stapp, 1976; Brogdon & Rowsey, 1977; Schneider, 1997; Feig, 2004). They
have an advantage when using science tools from multiple fields. Specialization is not lost, but a
new perspective is gained. Unfortunately, the Cartesian-Newtonian concept of scientific
modernism, with its fragmentation of the sciences, has only been reinforced throughout the
decades. Environmental science integration and assessment is most likely an anomaly within the
dominant educational paradigm. Hopefully the importance of interdisciplinary teaching and
learning in the sciences finds a way into assessment practices in higher education.

The Colorado Environmental Literacy Plan includes competencies from not just the
Science content area but also from Social Studies, including standards in History, Geography,
Economics and Civics. Social studies are equally as important as the sciences when assessing
environmental literacy. It is important that students not only have knowledge about ecological
processes and human impacts, but that they become active citizens interested in progressing their
communities and government. Students need a sense of civic and personal responsibility to the
environment. They must understand the social, economic and environmental conditions and
injustices of humanity. It is a combination of ecological and social knowledge and experiences
that contours students' attitudes, values and behavior. A second assessment should be created to
cover environmental social studies content such as population growth, environmental equity,
environmental history, migration, urbanization and development.



The greatest limitation of the assessment is that it only assesses the knowledge and skills of
individuals. Environmental literacy is influenced by more than these two components, as can be
seen below in Figure 5.1.



[Figure 5.1 diagram. Contexts: local, regional, or global situations that involve the environment. These require Competencies: identify environmental issues; analyze environmental issues; evaluate potential solutions to environmental issues; propose and justify actions that address the environmental issue. How one demonstrates the competencies is influenced by Environmental Knowledge (what you know about the physical and ecological system, environmental issues, sociopolitical systems, and strategies for addressing environmental issues) and by Dispositions toward the Environment (how you respond to environmental issues: interest, sensitivity, locus of control, responsibility, intention to act).]



Figure 5.1 PISA Framework for Assessing Environmental Literacy. The PISA 2015 framework 
emphasizes that competencies are influenced by both environmental knowledge as well as one's 
disposition toward the environment. 



A vital element in achieving environmental literacy is that an individual not only has
knowledge of ecological and social systems, issues and strategies, but also has a positive
disposition towards the environment. Future assessments of scientific knowledge or environmental
literacy might be combined with measures of behavior, attitudes and dispositions toward the
environment.



5.3 Dispositions towards the environment

Many individuals believe that they are environmentally literate, yet when asked to
describe nature they portray places absent of any human interference (Vining, 2008). Others do
not believe they are a part of nature at all. Humans have made an effort to control nature since the
beginning of their existence; examples include the Agricultural Revolution, the Industrial
Revolution and the Technological Revolution. Although we have entered what is known as the
Green Revolution, a continuation of technological advancements, humans seem to have lost their
connection with the natural world. In nations that are less developed and less industrialized,
we can see symbiotic relationships with nature, reflecting an image of early Americans before
technological advancement (Campbell, 1983; Eliade, 1964).

The fact that many Americans do not acknowledge they are a part of nature may
influence their environmental values and thus their actions towards the Earth (Dutcher, 2007).
Instead of respecting and seeing the value in indigenous ways, western cultures are continually
pushing economic development and, indirectly, environmental destruction on less industrialized
countries (Apffel-Marglin, 1990; Mander, 2006). The Dominant Social Paradigm (DSP) reinforces
the view that western civilization has the most superior knowledge and culture. It also emphasizes
that other nations' resistance to conform and develop stems from ignorance. However, Apffel-
Marglin (1990) has shown that it is not actually superior cognitive power that enables modern
knowledge to trump traditional knowledge, but the economic and social prestige associated with
western cultural history over the past 500 years. For many western societies, it is a difficult
concept to grasp that poor, indigenous people could be more environmentally literate.

Those with environmental concerns are challenging the existing paradigm. Kilbourne
(2002) has shown that the more one believes in and values the DSP, the lower their expressed
concern for the environment, an inverse relationship. Thanks to authors such as Thomas S.
Kuhn, whose writings in the 1960s covered topics such as paradigm anomalies, crises and shifts,
scientists began to exhibit different attitudes toward existing paradigms and started questioning
their nature. Dunlap and Van Liere (1978) developed the New Environmental Paradigm (NEP)
Scale to measure an individual's pro-environmental orientation. Its revision, the New Ecological
Paradigm Scale (Dunlap, Van Liere, Mertig, & Jones, 2000), was created to measure
environmental attitudes as influenced by fundamental values and beliefs. Many assessments have
since been created to assess the same issue (including Milfont, 2009). As illustrated in Figure 5.1, it is
my hope that learners demonstrate not only an increase in knowledge but also a shift in disposition
from DSP to NEP.

5.4 Environmental values and beliefs

According to Sean Esbjorn-Hargens (2009), western societies have six basic, heavily
weighted values. In decreasing order they are: security, power, principle, profit, people and planet.
It is ironic to me that people and planet would be at the bottom end of the scale. Farrior (2005)
categorized environmental values into three broad categories: egoistic concerns, social-altruistic
concerns and biospheric concerns. Egoistic concerns focus on one's own health, quality of life,
prosperity and convenience. Social-altruistic concerns focus on other people, such as
children, family, community and humanity. Lastly, biospheric concerns focus on the well-being
of non-human living organisms such as flora and fauna. Centuries of efforts have been
made to transform society's view of human dominion and the conquest of nature, which falls under
egoistic concerns. Although there have been a few environmentalists throughout history, it was not
until the 21st century that respect for the environment was brought about through a "deep-seated
realization of the fact that we and all other entities are aspects of a single unfolding reality" (Fox,
1990).

Many writers and experts in the field of EE believe that environmental behavior is the
ultimate goal of EE (e.g., Childress and Wert, 1978; Harvey, 1977; Hungerford and Peyton, 1976;
Hungerford, Peyton, and Wilke, 1980; Rubba and Wiesenmayer, 1985; Stapp, 1978). After all, an
individual's behaviors reveal whether they are considered operational in their environmental
literacy. "Environmental literacy should be defined ... in terms of observable behaviors. That is,
people should be able to demonstrate in some observable form what they have learned — their
knowledge of key concepts, skills acquired, disposition toward issues, and the like" (Daudi, 1997).
Western culture has, however, shown that an individual's behavior is often disconnected from the
attitudes or beliefs they hold. This has been termed the attitude-behavior gap: people
show concern for cars and factories releasing toxins and pollutants into the environment, yet they
continue to drive their cars and buy products that are not made sustainably (Campbell, 1963).

Allport (1935) defined an attitude as "a mental and neural state of readiness, organized
through experience, exerting a directive or dynamic influence upon the individual's response to all
objects and situations with which it is related". Behavior, on the other hand, is the manner of
conducting oneself. Although attitudes were once considered a direct precedent to behavior, this
is no longer an accepted idea among social psychologists (Greve, 2001).

Simply because an individual answers every question on the assessment correctly does
not mean that s/he consistently engages in environmental behaviors. "Individual and societal
environmental behavior belies the assumption that behavioral change follows directly from
development of necessary knowledge and skills" (Iozzi, 1989). Ultimately, many factors
have been found to influence pro-environmental behavior, including demographic factors,
external factors (e.g. institutional, economic, social and cultural), and internal factors (e.g.
motivation, pro-environmental knowledge, awareness, values, attitudes, emotion, locus of control,
responsibilities and priorities) (Kollmuss, 2002). Imagine environmental knowledge as the tip of
an enormous iceberg. The iceberg itself is environmental literacy, which necessitates the creation
of multiple assessments corresponding to each of its 'under water' components, and not
exclusively the 'visible' environmental knowledge.






6. Epilogue 

Personally, I have found my place in Environmental Education. I will undoubtedly spend
the rest of my life teaching courses on systems thinking, multicultural environmental
communication, atmospheric science, ecology, green technology and sustainability. It is my hope
that future generations will have a powerful connection to their living and nonliving
surroundings, have a strong sense of community, leadership and advocacy, and be able
to use their environmentally literate minds to protect and restore the Earth's balance. My hope is
that the instrument I have created, Assessing the Environmental Literacy of Intro Environmental
Science Students, will point educators in the right direction and give students a more focused and
personal curriculum and, in the end, a meaningful educational experience for all.






APPENDIX A. 
Geographic Dispersion of survey respondents 




Figure 2.1 Geographic dispersion of survey respondents. The map illustrates the geographic 
dispersion of respondents who completed the survey in Colorado. Yellow represents 1-2 
respondents, Light Green represents 3-5 respondents, Dark Green represents 6-15 respondents and 
Blue represents 15+ respondents (Navin, 2010). 






APPENDIX B. 
Introduction to Environmental Science Syllabus 






ENVS 1042: Introduction to Environmental Science 
Monday and Wednesday 12:30 to 1:45 and 2:00 - 3:15 
Tentative Syllabus 
Instructor: Dr. Jon Barbour 

Department of Geography and Environmental Sciences 
Office: North Classroom 3622. 

Phone: 303-556-4520 
Email: jon.barbour@cudenver.edu 

Office hours: Monday and Wednesday 8:00 - 9:00 a.m. or by appointment. 
Course Information Website: http://clasfaculty.ucdenver.edu/jbarbour/ 

TEXT Withgott and Brennan. Environment: The Science Behind the Stories 3 rd Edition, 
Pearson Education Inc. San Francisco 

PREREQUISITES: There are no formal prerequisites. Some basic math and science skills,
as well as familiarity with the use of library resources, will be required.

COURSE DESCRIPTION: The major objective of this course is to provide students with the 
tools and background information required to reasonably understand and discuss environmental 
issues facing current and future generations. The course also serves as an introductory course 
for the Earth & Environmental Sciences (EES) degree option within Geography. This course 
will cover basic biology, chemistry, physics, and ecological science that determine the Earth's 
environment in which we live today. 

MEASURABLE STUDENT LEARNING OBJECTIVES: 

Understanding of: 

1 . The basic science disciplines that are involved in Environmental Science. 

2. Functioning of the major systems and processes that are active in the Earth's 
environment. 

3. What is sustainability and what are the factors involved in achieving it. 

4. How we as human society may achieve and maintain both energy and environmental 
sustainability. 

Technical and analytical skills: 

1 . Basic research skills in researching, compiling and organizing information from 
libraries, the world wide web, scientific journals and databases. 

2. Synthesize and analyze information from different sources and points of view. 

TENTATIVE COURSE SCHEDULE: 

Wednesday 1/19 Class introduction 

Monday 1/24 An Introduction to Environmental Science (Chap 1) 

Wednesday 1/26 Environmental Ethics and Economics (Chap 2) 

Monday 1/31 Environmental Policy (Chap 3) 

Wednesday 2/2 From Chemistry to Energy to Life (Chap 4) 



Figure 3.2 ENVS 1042: Introduction to Environmental Science Syllabus 






Monday 2/7 Evolution, Biodiversity, and Population Ecology (Chap5) 

Wednesday 2/9 Species Interactions and Community Ecology (Chap 6) 

Monday 2/14 Environmental Systems and Ecosystem Ecology (Chap 7) 

Wednesday 2/16 Human Population (Chap 8) 

Monday 2/21 Soil and Agriculture (Chap 9) 

Wednesday 2/23 Agriculture, Biotechnology, and the Future of Food (Chap 10) 

Monday 2/28 Sustaining Biodiversity (Chap 11) 

Wednesday 3/2 Review for Mid Term Exam 

Monday 3/7 Mid Term Exam 

Wednesday 3/10 Return and Review Exam 

Monday 3/14 Resource Management (Chap 12) 

Wednesday 3/16 Urbanization and Creating Livable Cities (Chap 13) 

Monday 3/21 NO CLASS SPRING BREAK 

Wednesday 3/23 NO CLASS SPRING BREAK 

Monday 3/28 Environmental Health and Toxicology (Chap 14) 

Wednesday 3/30 Freshwater Resources (Chap 15) 

Monday 4/4 Marine and Coastal Systems (Chap 16) 

Wednesday 4/6 Atmospheric Science and Air Pollution (Chap 17) 

Monday 4/11 Global Climate Change (Chap 18) 

Wednesday 4/13 Fossil Fuels, Their Impacts, and Energy Conservation (Chap 19) 

Monday 4/18 Conventional Energy Alternatives (Chap 20) 
Wednesday 4/20 New Renewable Energy Alternatives (Chap 21) 

Monday 4/25 Waste Management (Chap 22) 
Wednesday 4/27 Sustainable Cities (Chap 23) 

Monday 5/2 Make up day for snow etc. 
Wednesday 5/4 Review for Final Exam 

FINAL EXAM (Comprehensive) According to Finals Schedule 

PLEASE NOTE: You must pass both lab and lecture sections to pass the course, i.e. you 
must obtain at least 60% of the points in lab (180) and lecture (240) to pass. Also, you 
must pick up your mid-term exam when handed back or 10 points will be deducted from 
your exam. 

Total points: 700 points distributed as follows: 
Exams: 

Mid Term Exam 100 
Comprehensive Final Exam 200 
Quizzes: 

There will be 5 unannounced quizzes during the term. 
Each will be 20 points for a total of 100 points. 
Total points from labs 
300 
You must register for a lab section as part of this course. The lab points are entirely 
determined by the lab instructor. 



Figure 3.2 (Continued)






APPENDIX C. 
AELIESS assessment instrument 






Title: Assessing the Environmental Literacy of Intro Environmental Science Students 
Date: 1/18/2012 

Student Information: Gender: male female Ethnicity: 

Age: 

Did you graduate High School? Yes/No Are you an ENVS major? Yes/No 
How many years of K-12 was attended in Colorado? 

DIRECTIONS: Multiple-Choice: please circle one answer for each question. 

1. Consider the following situations: 

Situation 1 : A battery is used to power a cell phone. 

Situation 2: The sun shines on a plant. 

Is energy being transferred in either of these situations? 

A. Energy is transferred in both situations. 

B. Energy is NOT transferred in either situation. 

C. Energy is transferred when a battery is used to power a cell phone, but energy is NOT 

transferred when the sun shines on a plant. 

D. Energy is transferred when the sun shines on a plant, but energy is NOT transferred when a 

battery is used to power a cell phone. 

2. The thermal energy of an object depends on which of the following? 

A. Both the temperature of the object and the material it is made of 

B. The temperature of the object but not the material it is made of 

C. The material the object is made of but not the temperature of the object 

D. Neither the temperature of the object nor the material it is made of 

3. Which of these is a renewable resource? 

A. Wood, because trees grow again 

B. Gold, because more can be made very easily 

C. Petroleum, because it can be refined into gasoline 

D. Coal, because more can be made in about 100 years 

4. Which energy transformation occurs first in a coal-burning power plant? 
A Chemical energy to thermal energy 

B Thermal energy to mechanical energy 
C Thermal energy to electrical energy 
D Mechanical energy to electrical energy 

5. Coal, petroleum, and natural gas found underground in certain parts of Earth are primarily 
formed from which process? 

A. Decay of radioactive elements 

B. Collision of tectonic plates in earthquakes 

C. Transformation of dead plants and animals under heat and pressure 

D. Intrusion of water into the soil that breaks up rocks and minerals 

Figure 3.3 AELIESS assessment instrument

6. Which of the following is TRUE about the extinction of species?






A. Very few species have ever become extinct. Most continue to exist. 

B. There have been extinction events in which many species became extinct at about the same 

time. Aside 
from these, extinction is very rare. 

C. Up until recently, species rarely became extinct. Humans have caused the majority of 

extinctions. 

D. Many species have become extinct throughout the history of life on earth. 

7. Which of the following is TRUE about how changes can happen to the physical environment of 

earth? 

A. Changes can happen suddenly or gradually. 

B. Changes can happen suddenly but not gradually. 

C. Changes can happen gradually but not suddenly. 

D. Changes can happen neither gradually nor suddenly because the environment does not change. 

8. Which of the following is food for a plant? 

A. Sugars that a plant makes 

B. Minerals that a plant takes in from the soil 

C. Water that a plant takes in through its roots 

D. Carbon dioxide that a plant takes in through its leaves 

9. Because they are rapidly being cut down, the rain forests today are endangered ecosystems. 
How might widespread destruction of the rain forests affect other ecosystems in the world? 

A. by increasing the amount of available soil 

B. by reducing the amount of available oxygen 

C. by increasing the diversity of plant and animal life 

D. by reducing the amount of available carbon dioxide 

10. When the environment changes more quickly than a species can adapt, the species may 
become 

A. extinct 

B. diverse 

C. dominant 

D. overpopulated 

11. The diagram below shows the feeding relationships between populations of plants and animals 
in an area. The arrows point from the organisms being eaten to the organisms that eat them. 



Figure 3.3 (Continued) 






A new species that eats only mice becomes part of this food web, greatly reducing the number of 
mice in this area. Using only the relationships between the plants and animals shown in the 
diagram, what effect would the new species have on the caterpillar population if the number of 
foxes stays the same? 

A. The number of caterpillars would increase. 

B. The number of caterpillars would decrease. 

C. The number of caterpillars would stay the same. 

D. There is not enough information to tell what would happen to the number of caterpillars. 

12. Which of the following statements about competition between animals is TRUE? 

A. Competition may involve two lions fighting over prey but not two cows eating grass in the 

same field. 

B. Competition may involve two birds fighting over a nesting site but not one bird placing its eggs 
in the nest of another. 

C. Competition may involve two birds fighting over a nesting site, two lions fighting over prey, or 
one bird placing its eggs in the nest of another but not two cows eating grass in the same field. 

D. Competition may involve two birds fighting over a nesting site, two lions fighting over prey, 
one bird placing its eggs in the nest of another, or two cows eating grass in the same field. 

13. As the energy needs for Colorado increase, new sources of energy are required to replace or 
supplement the nonrenewable sources of energy now in use. 

Two sources of energy that are renewable and available in Colorado are — 

A. natural gas and wind power 

B. coal and hydropower 

C. petroleum and solar power 

D. wind power and solar power 

14. Which form of energy strikes the best balance between energy production and environmental 
impact? 

A) solar 

B) tidal 

C) nuclear 

D) algae biofuel 

15. The greenhouse effect presents some concern to humans but it is also an important part of 
Earth's ecosystem. Why is this? 

A. It makes Earth habitable by cooling its atmosphere. 

B. It makes Earth habitable by warming its atmosphere. 

C. It helps screen out harmful radiation from the sun. 

D. It prevents carbon dioxide from escaping Earth's atmosphere. 

16. Which of these has the LEAST influence on an area's climate? 

A. latitude 

B. elevation 

C. soil conditions 

D. adjacent large bodies of water 
Figure 3.3 (Continued)






Item      1   2   3   4   5   6   7   8   9   10   11   12   13   14   15   16
Answer    A   A   A   A   C   D   A   A   B   A    A    D    D    D    B    C



Figure 3.3 (Continued)






APPENDIX D. 
AELIESS questions chosen using Colorado academic standard outline 






Table 2.1 AELIESS Questions chosen using the Colorado Academic Standard's outline of critical 
concepts and skills for K-12 ("Colorado academic standards," 2009). 

Questions 1-3 (Physical Science) were created using:

Content Area: Science. Grade Level Expectations: High School. Standard: 1. Physical Science.
Concepts and skills students master:

1. Energy exists in many forms such as mechanical, chemical, electrical, radiant, thermal,
and nuclear, that can be quantified and experimentally determined

Questions 4-13 (Life Science) were created using:

Content Area: Science. Grade Level Expectations: High School. Standard: 2. Life Science;
Content Area: Science. Grade Level Expectations: Sixth Grade. Standard: 2. Life Science.
Concepts and skills students master:

2. Matter tends to be cycled within an ecosystem, while energy is transformed and
eventually exits an ecosystem

3. The size and persistence of populations depend on their interactions with each
other and on the abiotic factors in an ecosystem

4. The energy for life primarily derives from the interrelated processes of photosynthesis 
and cellular respiration. Photosynthesis transforms the sun's light energy into the 
chemical energy of molecular bonds. Cellular respiration allows cells to utilize chemical 
energy when these bonds are broken. 

5. Changes in environmental conditions can affect the survival of individual organisms, 
populations, and entire species 

6. Organisms interact with each other and their environment in various ways that create a 
flow of energy and cycling of matter in an ecosystem 






Table 2.1 (Continued) 



Questions 14-16 (Earth Systems Science) were created using:

Content Area: Science. Grade Level Expectations: High School. Standard: 3. Earth Systems Science;
Content Area: Science. Grade Level Expectations: Eighth Grade. Standard: 3. Earth Systems Science.

Concepts and skills students master:

1. Climate is the result of energy transfer among interactions of the atmosphere,
hydrosphere, lithosphere, and biosphere

2. There are costs, benefits, and consequences of exploration, development, and 
consumption of renewable and nonrenewable resources 

3. Earth has a variety of climates defined by average temperature, precipitation, 
humidity, air pressure, and wind that have changed over time in a particular location 






APPENDIX E. 
Studies assessing aspects of EL 






Table 3.1 A selection of studies that assess instructional effectiveness concerning aspects of EL. 

(Hungerford, 2005, p.76-77) 





A Selection of Studies Which Assessed Instructional Effectiveness Concerning Aspects of Environmental Literacy

Study | Validity/Reliability | Independent Variable | Subject Grade/Age | Dependent Variable(s) | Significant Effect
Adams et al., 1937 | No/No | Biology course | High School | Attitude | +
Armstrong & Impara, 1991 | Yes/Yes | Supplemental instruction | 5, 7 | Attitude, Ecological Knowledge | Mixed
Bennet, 1982 | Yes/Yes | Social studies program | Junior high school | Socio-Political Knowledge | +
Benton, 1993 | No/Yes | Environmental management course | College | Attitude, Environmental Issue Knowledge, Additional Determinants, Responsible Behavior | +, +, +, +
Birch & Schwaab, 1983 | Yes/Yes | Instructional unit | 7 | Attitude, Environmental Issue Knowledge | +, +
Brothers et al., 1991 | Yes/No | Television documentary | Adults | Attitude, Environmental Issue Knowledge | +, +
Burrus-Bammel & Bammel, 1986 | Yes/Yes | Residential camp | 16-20 years | Attitude, Ecological Knowledge | +, +
Collins et al., 1978 | No/Yes | Field trip with activities | 4, 5, 6 | Attitude | +
Crater & Mears, 1981 | No/No | Instructional unit | 8 | Attitude, Environmental Issue Knowledge | +, +
Dresner, 1989/90 | No/No | Simulation game | College | Attitude, Additional Determinants | -
Dunlop, 1979 | Yes/Yes | Simulator | Teachers | Attitude | -
Fortner & Lahm, 1990 | Yes/Yes | Instruction (in-classroom information and site visit) | 4, 5 | Attitude, Ecological Knowledge | +
Fortner & Lyon, 1985 | Yes/No | Television documentary | Adults | Attitude, Ecological Knowledge | +, +
Geller, 1981 | No/No | Workshop | Adults | Attitude, Additional Determinants, Responsible Behavior | +, +
Glass, 1981 | No/Yes | Workshop | Teachers | Attitude, Environmental Issue Knowledge | +
Jans, 1982 | No/Yes | Instruction | 5 | Attitude | +
Jans, 1984 | Yes/Yes | Instruction | 3 | Attitude | +
Jordan et al., 1986 | Yes/No | Residential camp | High School | Socio-Political Knowledge, Responsible Behavior | +, +
Kidd et al., 1978 | No/No | Forest camp | 16-20 years | Attitude, Ecological Knowledge | +, +
Kinsey & Wheatley, 1984 | No/No | Environmental studies course | College | Attitude | Mixed
Lawrenz, 1985 | No/Yes | Workshop | Teachers | Attitude | -
Marshdoyle et al., 1982 | No/No | Field trip | 4, 5, 6 | Ecological Knowledge | +
Mills et al., 1985 | Yes/Yes | Computer simulation | Teachers | Attitude, Environmental Issue Knowledge | +
Milton et al., 1995 | No/Yes | Park/school program | 5 | Attitude, Ecological Knowledge | +
Pomerantz, 1986 | Yes/No | Children's nature magazine | 5 | Ecological Knowledge | +
Ramsey & Hungerford, 1989 | Yes/Yes | Instruction (investigation and action) | 7 | Attitude, Socio-Political Knowledge, Cognitive Skills, Additional Determinants, Responsible Behavior | +, +
Ramsey, 1993 | Yes/Yes | Instruction (investigation and action) | 8 | Attitude, Socio-Political Knowledge, Cognitive Skills, Additional Determinants, Responsible Behavior | +, +, Mixed, +
Ramsey et al., 1981 | No/No | Instruction (investigation and action) | 8 | Socio-Political Knowledge, Responsible Behavior | +, +
Ross & Driver, 1986 | No/No | Youth Conservation Corps program | 15-18 years | Attitude, Environmental Issue Knowledge, Responsible Behavior | +, +, +
Shepard & Speelman, 1985/86 | No/No | Outdoor education program | 9-14 years | Attitude | -
Simmons, 1984 | No/No | Presentation methods (on-site vs. simulated visit) | Adults | Attitude, Environmental Issue Knowledge | +, +
Smith-Sebasto, 1995 | Yes/Yes | Environmental studies course | College | Socio-Political Knowledge, Cognitive Skills, Additional Determinants, Responsible Behavior |
Stapp et al., 1983 | No/No | Middle school curriculum | 6, 7 | Attitude, Environmental Issue Knowledge, Cognitive Skills, Additional Determinants | +, +
Strickland et al., 1983/84 | Yes/Yes | Instruction | 3-5 years | Environmental Issue Knowledge | +
Trent, 1978 | No/Yes | Workshop | Teachers | Attitude, Environmental Issue Knowledge | +
Volk & Hungerford, 1981 | No/Yes | Instruction | 8 | Environmental Issue Knowledge, Cognitive Skills | +, +
Westphal & Halverson, 1985/86 | No/No | Workshop | Adults | Environmental Issue Knowledge, Responsible Behavior | +, +
Wilson & Tomera, 1980 | Yes/Yes | Supplemental case study | High school | Attitude | -



APPENDIX F. 
EL contexts and distributions 






Table 3.2 Contexts for environmental literacy. The following table was taken from the PISA
environmental literacy framework and used to develop items on AELIESS (Hollweg et al., 2011).





Biodiversity
  Local: Flora and fauna
  Regional: Endangered species, habitat loss, exotic invasive species
  Global: Ecological sustainability, sustainable use of species

Population Growth
  Local: Growth, birth/death, emigration, immigration
  Regional: Maintenance of human population, population distribution, overpopulation
  Global: Population growth and its social, economic, and environmental consequences

Natural Resources
  Local: Personal consumption of materials
  Regional: Production and distributions of food, water, energy
  Global: Sustainable use of renewable and non-renewable resources

Environmental Quality and Health
  Local: Impact of use and disposal of materials on air and water quality
  Regional: Disposal of sewage and solid waste, environmental impact
  Global: Sustainability of ecosystem services

Natural Hazards and Extreme Weather
  Local: Decisions about housing in areas vulnerable to flooding, tidal and wind damage
  Regional: Rapid changes (e.g. earthquakes), slow changes (coastal erosion), risks and benefits
  Global: Climate change, extreme weather events

Land Use
  Local: Conservation of agricultural lands and natural areas
  Regional: Impact of development and diversion of water, watersheds, and flood plains
  Global: Production and loss of topsoil, loss of arable land



Table 3.3 Distributions of contexts: The items that include this context, as well as the percentage of each context represented in the assessment AELIESS.

Context | Items including context | % Items containing context
Biodiversity | 6, 8, 9, 10, 11, 12 | 37.5
Natural Resources | 1, 2, 3, 4, 5, 13, 14 | 43.75
Environmental Quality and Health | 7, 13, 14, 15 | 25
Natural Hazards and Extreme Weather | 7, 10, 15 | 18.75
Land Use | 9, 16 | 12.5
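For reference, each percentage in Table 3.3 appears to follow from a simple proportion over the 16 items of AELIESS; the following worked check is included only as an illustration, with n standing for the number of items tagged with a given context (a symbol introduced here, not taken from the source):

\[
\text{\% items containing context} = \frac{n}{16} \times 100,
\qquad \text{e.g., Biodiversity: } \frac{6}{16} \times 100 = 37.5\%.
\]

Because several items (for example, items 7, 13, and 14) are tagged with more than one context, these percentages sum to more than 100.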













70 



APPENDIX G. 
IRB approval letter 



71 




[Scanned letterhead: Colorado Multiple Institutional Review Board (COMIRB), University of Colorado Denver | Anschutz Medical Campus; street address, telephone numbers, and affiliated-institution listing are illegible in the scanned copy]

[day illegible]-Dec-2011

Certificate of Exemption

Investigator: Randi Hogden
Sponsor(s):
Subject: COMIRB Protocol 11-1474, Initial Application
Effective Date: 06-Dec-2011
Anticipated Completion Date: 06-Dec-2014
Exempt Category: 1
Title: Assessing The Environmental Literacy Of Intro Environmental Science Students

This protocol qualifies for exempt status. Periodic continuing review is not required. For the duration of your protocol, any change in the experimental design/content of this study must be approved by the COMIRB before implementation of the changes.

The anticipated completion date of this protocol is 06-Dec-2014. COMIRB will administratively close this project on this date unless otherwise instructed either by correspondence, telephone or e-mail to COMIRB@ucdenver.edu. If the project is closed [remainder of sentence illegible in the scanned copy].

You will be contacted every 5 years for a status report on this project.

Any questions regarding the COMIRB action of this study should be referred to the COMIRB staff at 303-724-[number illegible] or [illegible].

Requirements:

This Exempt Approval Includes - This submission was submitted as Expedited but was determined to qualify as Exempt [remainder of note illegible in the scanned copy; version noted as 12/1/2011].

[illegible]
Information Sheet

Affiliated Site - Downtown Denver Campus

Sincerely,
UCD Panel S



72 



BIBLIOGRAPHY 

American Association for the Advancement of Science (2001a). Atlas of Science Literacy. Washington, DC: American Association for the Advancement of Science.

Arcury, T. A. (January 01, 1990). Environmental attitude and environmental knowledge. Human Organization.

Allport, G. W. (1935). A handbook of social psychology (C. Murchison, Ed.). Worcester, MA: Clark University Press.

Apffel-Marglin, F., & Marglin, S. A. (1990). Dominating knowledge: Development, culture, and resistance. Oxford: Clarendon Press.

Brogdon, R., & Rowsey, R. (1977). Some effects of an interdisciplinary environmental education effort. Journal of Environmental Education, 8, 3, 26-31.

Campbell, D. T. (1963). Social attitudes and other acquired behavioral dispositions. In S. Koch (Ed.), Psychology: A study of a science, (6), 94-172. New York: McGraw-Hill.

Campbell, J. (1983). The Way of Animal Powers. New York: A. van der Marck.

Childress, R. B., & Wert, J. (1978). Challenges for environmental education planners. The Journal of Environmental Education, 7, 4, 2-6.

Colorado Alliance for Environmental Education (CAEE). (2011). Colorado environmental literacy plan draft 21. Retrieved from website: http://www.caee.org/colorado-environmental-education-plan

Colorado Department of Education, Office of Standards and Assessment. (2009). Colorado academic standards: science. Retrieved from website: http://www.cde.state.co.us/index stnd-access.htm

Colorado Department of Education. (2011). CSAP / TCAP: Assessment window. Retrieved from website: http://www.cde.state.co.us/assessment/CoAssess-AssessmentWindow.asp

Colorado environmental literacy plan: task force meeting 3. (2010, October 12). Retrieved from http://eeforeveryone.wetpaint.com/page/Task Force Meeting #3 10-12-10

Coyle, K., & National Environmental Education & Training Foundation. (2005). Environmental literacy in America: What ten years of NEETF/Roper research and related studies say about environmental literacy in the U.S. Washington, D.C.: NEETF.

Covey, S. R. (1991). Principle-centered leadership. New York: Summit Books.

Daudi, S. S., & Heimlich, J. E. (1997). Advancing education & environmental literacy. EETAP Resource Library.



73 



Disinger, J. F. (January 01, 1985). What Research Says: Environmental Education's Definitional Problem. School Science and Mathematics, 85, 1, 59-68.

Disinger, J., & Roth, C. (1992). Environmental literacy. Columbus, OH: ERIC Science, Mathematics, and Environmental Education Clearinghouse. (ERIC Document Reproduction Service No. ED 35120)

Dunlap, R. E., & Van Liere, K. D. (1978). The "new environmental paradigm": a proposed measuring instrument and preliminary results. Journal of Environmental Education, 9, 4, 10-19.

Dunlap, R. E., Van Liere, K. D., Mertig, A. G., & Jones, R. E. (January 01, 2000). New Trends in Measuring Environmental Attitudes: Measuring Endorsement of the New Ecological Paradigm: A Revised NEP Scale. Journal of Social Issues, 56, 3, 425-442.

Dutcher, D., Finley, J., Luloff, A. E., & Johnson, J. (January 01, 2007). Connectivity With Nature as a Measure of Environmental Values. Environment and Behavior, 39, 4, 474-493.

Eliade, M. (1964). Shamanism: Archaic Techniques of Ecstasy. Princeton, NJ: Princeton University Press.

Erdogan, M., Kostova, Z., & Marcinkowski, T. (February 01, 2009). Components of environmental literacy in elementary science education curriculum in Bulgaria and Turkey. Eurasia Journal of Mathematics, Science and Technology Education, 5, 1, 15-26.

Erickson, R. J. (1997). Paper or plastic?: Energy, environment, and consumerism in Sweden and America. Westport, CT: Praeger.

Farrior, M. (2005). Breakthrough strategies for engaging the public: Emerging trends in communications and social science. Chicago: Biodiversity Project. Published on the Internet. Available at http://www.biodiversityproject.org/docs/publicationsandtipsheets/breakthroughstrategiesforengagingthepublic.pdf [accessed 20 June 2010].

Feig, A. L. (2004). Challenge your teaching. Nature Structural & Molecular Biology. Nature Publishing Group, 11, 1, 16-19.

Fox, W. (1990). Toward a transpersonal ecology. Boston: New Science Library.

Goleman, D. (2009). Ecological intelligence: How knowing the hidden impacts of what we buy can change everything. New York: Broadway Books.

Golley, F. B. (1998). A primer for environmental literacy. New Haven: Yale University Press.

Greve, W. (2001). Traps and gaps in action explanation: Theoretical problems of a psychology of human action. Psychological Review, 108, 435-451.

Hart, E. P. (December 07, 1981). Identification of Key Characteristics of Environmental Education. Journal of Environmental Education, 13, 1, 12-16.



74 



Harvey, G. D. (1977a). A conceptualization of environmental education. In J. Aldrich, A. Blackburn, and G. Abel (Eds.), A Report on the North American Regional Seminar on Environmental Education (pp. 66-72). Columbus, OH: ERIC Clearinghouse for Science, Mathematics, and Environmental Education.

Harvey, G. D. (1977b). Environmental Education: A delineation of substantive structure. (Doctoral dissertation, Southern Illinois University, 1976). Dissertation Abstracts International, 38(2), 611A. (UMI No. 77-16622)

Hazen, R., & Trefil, J. (1991). Science Matters: Achieving Scientific Literacy. Anchor Books.

Hollweg, K. S., Taylor, J. R., Bybee, R. W., Marcinkowski, T. J., McBeth, W. C., & Zoido, P. (2011). Developing a framework for assessing environmental literacy. Washington, DC: North American Association for Environmental Education. Available at http://www.naaee.net.

Hungerford, H. R. (January 01, 1975). Myths of Environmental Education. Journal of Environmental Education, 7, 2, 21-26.

Hungerford, H. R., & Center for Instruction, Staff Development and Evaluation. (2005). Essential readings in environmental education. Champaign, IL: Stipes Pub.

Hungerford, H. R., & Peyton, R. B. (1976). Teaching environmental education. Portland, ME: J. Weston Walch.

Hungerford, H. R., Peyton, R. B., & Wilke, R. J. (1980). Goals for curriculum development in environmental education. The Journal of Environmental Education, 11(3), 42-47.

Hungerford, H. R., Peyton, R. B., & Wilke, R. J. (1983). Yes, Environmental Education Does Have Definition and Structure. Journal of Environmental Education, 14, 3, 1-2.

Hungerford, H. R., & Tomera, A. N. (1977). Science in the elementary school: A worktext. Champaign, IL: Stipes.

Hungerford, H. R., Bluhm, W. J., Volk, T. L., & Ramsey, J. M. (Eds.). (2005). Essential readings in environmental education. Champaign, IL: Stipes.

Iozzi, L. A. (June 06, 1989). What Research Says to the Educator. Part One: Environmental Education and the Affective Domain. Journal of Environmental Education, 20, 3, 3-9.

Kilbourne, W. E., Beckmann, S. C., & Thelen, E. (2002). The role of the dominant social paradigm in environmental attitudes: A multinational examination.

Kollmuss, A., & Agyeman, J. (August 01, 2002). Mind the Gap: why do people act environmentally and what are the barriers to pro-environmental behavior? Environmental Education Research, 8, 3, 239-260.

Loubser, C. P., Swanepoel, C. H., & Chacko, C. P. C. (January 01, 2001). Concept formulation for environmental literacy. South African Journal of Education, 21, 317-323.



75 



Mander, J., Tauli-Corpuz, V., & International Forum on Globalization. (2006). Paradigm wars: Indigenous peoples' resistance to globalization. San Francisco: Sierra Club Books.

McBeth, W., & Volk, T. L. (January 01, 2010). The National Environmental Literacy Project: A Baseline Study of Middle Grade Students in the United States. Journal of Environmental Education, 41, 1, 55-67.

McKibben, B. (2007). Deep economy: The wealth of communities and the durable future. New York: Times Books.

Milfont, T. L., & Duckitt, J. (March 01, 2010). The environmental attitudes inventory: A valid and reliable measure to assess the structure of environmental attitudes. Journal of Environmental Psychology, 30, 1, 80-94.

Milfont, T. L. (January 01, 2009). The effects of social desirability on self-reported environmental attitudes and ecological behaviour. The Environmentalist, 29, 3, 263-269.

Miller, J. D., & Northern Illinois University. (1989). Scientific literacy. DeKalb, IL: Northern Illinois University, Public Opinion Laboratory.

Miller, J. D. (2011). The conceptualization and measurement of civic science literacy for the twenty-first century. In J. Meinwald & J. G. Hildebrand (Eds.), Science and the Educated American: A Core Component of Liberal Education (pp. 241-255). American Academy of Arts and Sciences.

National Research Council (U.S.). (1996). National Science Education Standards: Observe, interact, change, learn. Washington, DC: National Academy Press.

Navin, K. (2010, October). CAEE CELP survey summary. Paper presented at CAEE task force meeting #3.

Negev, M., Sagy, G., Garb, Y., Salzberg, A., & Tal, A. (December 01, 2008). Evaluating the environmental literacy of Israeli elementary and high school students. Journal of Environmental Education, 39, 2, 3-20.

(2011). No child left inside act of 2011 (NCLI) (S. 1372, H.R. 2547). Retrieved from website: http://www.cbf.org/document.doc?id=790

Payne, R. K., Ryukoku Daigaku, & Institute of Buddhist Studies (Berkeley, Calif.). (2010). How much is enough?: Buddhism, consumerism, and the human environment. Somerville, MA: Wisdom Publications.

Project 2061 (American Association for the Advancement of Science). (1993). Benchmarks for science literacy. New York: Oxford University Press.

Roth, C. E. (1992). Environmental literacy: Its roots, evolution and directions in the 1990s. Columbus: Ohio State University, ERIC Clearinghouse for Science, Mathematics, and Environmental Education.



76 



Roth, R. E. (1976). A review of research related to environmental education, 1973-1976. Columbus, OH: ERIC Clearinghouse for Science, Mathematics, and Environmental Education, Ohio State University.

Rubba, P. A., & Wiesenmayer, R. (1985). A goal structure for precollege STS education: A proposal based upon recent literature in environmental education. The Bulletin of Science, Technology and Society, 5, 6, 573-580.

Rutherford, F. J., & Ahlgren, A. (1990). Science for all Americans. New York: Oxford University Press.

Schneider, S. H. (November 01, 1997). Defining and teaching environmental literacy. Trends in Ecology & Evolution, 12, 11, 457.

Shin, D., Chu, H., Lee, E., Ko, H., Lee, M., Kang, K., Min, B., & Park, J. (2005). An assessment of Korean students' environmental literacy. Journal of the Korean Earth Science Society, 26, 4, 358-364.

Simmons, D. (1995). Working Paper #2: Developing a framework for National Environmental Education Standards. In Papers on the Development of Environmental Education Standards (pp. 10-58). Troy, OH: North American Association for Environmental Education.

Stapp, W. B., Bennet, D., Bryan, W., Fulton, J., Swan, J., Wall, R., & Havlick, S. (1969). The concept of environmental education. The Journal of Environmental Education, 1, 1, 30-31.

Stapp, W. B. (1976). International environmental education: The UNESCO-UNEP programme. Journal of Environmental Education, 8, 2, 19-25.

Stapp, W. B., & SMEAC Information Reference Center. (1978). From ought to action in environmental education: A report of the National Leadership Conference on Environmental Education. Columbus, OH: SMEAC Information Reference Center, the Ohio State University, College of Education and School of Natural Resources.

Tozer, S., Violas, P. C., & Senese, G. B. (2006). School and society: Historical and contemporary perspectives. Boston: McGraw-Hill.

UNESCO. (1977). Trends in environmental education. Paris: UNESCO.

UNESCO. (1978). Final Report: Intergovernmental Conference on Environmental Education. Paris: UNESCO ED/MD/49.

U.S. EPA (Environmental Protection Agency). (1992). Federal Register, October 16, 1992. p. 47516.



77 



Vining, J., Merrick, M. S., & Price, E. A. (2008). The distinction between humans and nature: Human perceptions of connectedness to nature and elements of the natural and unnatural. Human Ecology Review, 15, 1, 1-11.

Volk, T. L., McBeth, W. C., & North American Association for Environmental Education. (1998). Environmental literacy in the United States: What should be, what is, getting from here to there. Rock Spring, GA: North American Association for Environmental Education.

Wilke, R. (Ed.). (1995). Environmental Education Literacy/Needs Assessment Project: Assessing environmental literacy of students and environmental education needs of teachers; Final Report for 1993-1995. (Report to NCEET/University of Michigan under U.S. EPA Grant #NT901935-01-2). Stevens Point, WI: University of Wisconsin - Stevens Point.

Wilke, R. (1996). Environmental literacy and the college curriculum. Global Issues Earth Day 1996: Environmental Education, 15.

Withgott, J., Murck, B. W., & Brennan, S. R. (2009). Environment: The science behind the stories. Toronto: Pearson Canada.



78